Hadoop 2.4 maven build OutOfMemoryError: Java heap space

I was trying to build the full Hadoop source tree on my Mac. Whenever I build, I keep hitting the error "java.lang.OutOfMemoryError: Java heap space". I tried setting the Maven options with export MAVEN_OPTS="-Xms256m -Xmx1024m" and with export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m", but neither of these helped. The machine has 4 GB of DDR3 RAM and a 2.3 GHz Intel Core i5. Please point me in the right direction. The build log is below:

[INFO] ------------------------------------------------------------------------ 
[INFO] Building Apache Hadoop HDFS 2.4.0 
[INFO] ------------------------------------------------------------------------ 
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs --- 
[INFO] Deleting /Users/harshavyaspalli/Sachin/hadoop/hadoop-2.4.0-src/hadoop-hdfs-project/hadoop-hdfs/target 
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs --- 
[INFO] Executing tasks 

main: 
    [mkdir] Created dir: /Users/harshavyaspalli/Sachin/hadoop/hadoop-2.4.0-src/hadoop-hdfs-project/hadoop-hdfs/target/test-dir 
    [mkdir] Created dir: /Users/harshavyaspalli/Sachin/hadoop/hadoop-2.4.0-src/hadoop-hdfs-project/hadoop-hdfs/target/test/data 
[INFO] Executed tasks 
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-jsp-generated-sources-directory) @ hadoop-hdfs --- 
[INFO] Executing tasks 

main: 
    [mkdir] Created dir: /Users/harshavyaspalli/Sachin/hadoop/hadoop-2.4.0-src/hadoop-hdfs-project/hadoop-hdfs/target/generated-sources/java 
[INFO] Executed tasks 
[INFO] 
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (hdfs) @ hadoop-hdfs --- 
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'. 
Created dir: /Users/harshavyaspalli/Sachin/hadoop/hadoop-2.4.0-src/hadoop-hdfs-project/hadoop-hdfs/target/classes 
[INFO] Compiling 8 JSP source files to /Users/harshavyaspalli/Sachin/hadoop/hadoop-2.4.0-src/hadoop-hdfs-project/hadoop-hdfs/target/generated-sources/java 
[INFO] Built File: /block_info_xml.jsp 
[INFO] Built File: /corrupt_files.jsp 
[INFO] Built File: /corrupt_replicas_xml.jsp 
[INFO] Built File: /decommission.jsp 
[INFO] Built File: /dfsclusterhealth.jsp 
[INFO] Built File: /dfshealth.jsp 
[INFO] Built File: /dfsnodelist.jsp 
[INFO] Built File: /nn_browsedfscontent.jsp 
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked. 
WARN: Please see http://www.slf4j.org/codes.html for an explanation. 
[INFO] Compiled completed in 0:00:00.579 
[INFO] 
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (secondary) @ hadoop-hdfs --- 
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'. 
[INFO] Compiling 1 JSP source file to /Users/harshavyaspalli/Sachin/hadoop/hadoop-2.4.0-src/hadoop-hdfs-project/hadoop-hdfs/target/generated-sources/java 
[INFO] Built File: /status.jsp 
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked. 
WARN: Please see http://www.slf4j.org/codes.html for an explanation. 
[INFO] Compiled completed in 0:00:00.041 
[INFO] 
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (journal) @ hadoop-hdfs --- 
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'. 
[INFO] Compiling 1 JSP source file to /Users/harshavyaspalli/Sachin/hadoop/hadoop-2.4.0-src/hadoop-hdfs-project/hadoop-hdfs/target/generated-sources/java 
[INFO] Built File: /journalstatus.jsp 
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked. 
WARN: Please see http://www.slf4j.org/codes.html for an explanation. 
[INFO] Compiled completed in 0:00:00.041 
[INFO] 
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (datanode) @ hadoop-hdfs --- 
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'. 
[INFO] Compiling 4 JSP source files to /Users/harshavyaspalli/Sachin/hadoop/hadoop-2.4.0-src/hadoop-hdfs-project/hadoop-hdfs/target/generated-sources/java 
[INFO] Built File: /browseBlock.jsp 
[INFO] Built File: /browseDirectory.jsp 
[INFO] Built File: /dataNodeHome.jsp 
[INFO] Built File: /tail.jsp 
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked. 
WARN: Please see http://www.slf4j.org/codes.html for an explanation. 
[INFO] Compiled completed in 0:00:00.073 
[INFO] 
[INFO] --- build-helper-maven-plugin:1.5:add-source (add-jsp-generated-sources-directory) @ hadoop-hdfs --- 
[INFO] Source directory: /Users/harshavyaspalli/Sachin/hadoop/hadoop-2.4.0-src/hadoop-hdfs-project/hadoop-hdfs/target/generated-sources/java added. 
[INFO] 
[INFO] --- hadoop-maven-plugins:2.4.0:protoc (compile-protoc) @ hadoop-hdfs --- 
[WARNING] [protoc, --version] failed with error code 1 
[INFO] 
[INFO] --- hadoop-maven-plugins:2.4.0:protoc (compile-protoc-datanode) @ hadoop-hdfs --- 
[WARNING] [protoc, --version] failed with error code 1 
[INFO] 
[INFO] --- hadoop-maven-plugins:2.4.0:protoc (compile-protoc-namenode) @ hadoop-hdfs --- 
[WARNING] [protoc, --version] failed with error code 1 
[INFO] 
[INFO] --- hadoop-maven-plugins:2.4.0:protoc (compile-protoc-qjournal) @ hadoop-hdfs --- 
[WARNING] [protoc, --version] failed with error code 1 
[INFO] 
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-hdfs --- 
[INFO] Using default encoding to copy filtered resources. 
[INFO] 
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ hadoop-hdfs --- 
[INFO] Compiling 587 source files to /Users/harshavyaspalli/Sachin/hadoop/hadoop-2.4.0-src/hadoop-hdfs-project/hadoop-hdfs/target/classes 
[INFO] ------------------------------------------------------------- 
[ERROR] COMPILATION ERROR : 
[INFO] ------------------------------------------------------------- 
[ERROR] Failure executing javac, but could not parse the error: 


The system is out of resources. 
Consult the following stack trace for details. 
java.lang.OutOfMemoryError: Java heap space 
    at com.sun.tools.javac.util.Position$LineMapImpl.build(Position.java:139) 
    at com.sun.tools.javac.util.Position.makeLineMap(Position.java:63) 
    at com.sun.tools.javac.parser.Scanner.getLineMap(Scanner.java:1113) 
    at com.sun.tools.javac.main.JavaCompiler.parse(JavaCompiler.java:512) 
    at com.sun.tools.javac.main.JavaCompiler.parse(JavaCompiler.java:550) 
    at com.sun.tools.javac.main.JavaCompiler.parseFiles(JavaCompiler.java:804) 
    at com.sun.tools.javac.main.JavaCompiler.compile(JavaCompiler.java:727) 
    at com.sun.tools.javac.main.Main.compile(Main.java:353) 
    at com.sun.tools.javac.main.Main.compile(Main.java:279) 
    at com.sun.tools.javac.main.Main.compile(Main.java:270) 
    at com.sun.tools.javac.Main.compile(Main.java:87) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) 
    at java.lang.reflect.Method.invoke(Method.java:597) 
    at org.codehaus.plexus.compiler.javac.JavacCompiler.compileInProcess0(JavacCompiler.java:551) 
    at org.codehaus.plexus.compiler.javac.JavacCompiler.compileInProcess(JavacCompiler.java:526) 
    at org.codehaus.plexus.compiler.javac.JavacCompiler.compile(JavacCompiler.java:167) 
    at org.apache.maven.plugin.AbstractCompilerMojo.execute(AbstractCompilerMojo.java:678) 
    at org.apache.maven.plugin.CompilerMojo.execute(CompilerMojo.java:128) 
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:133) 
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208) 
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153) 
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145) 
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:108) 
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:76) 
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51) 
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:116) 
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:361) 
    at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:155) 
    at org.apache.maven.cli.MavenCli.execute(MavenCli.java:584) 
    at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:213) 

[INFO] 1 error 
[INFO] ------------------------------------------------------------- 
[INFO] ------------------------------------------------------------------------ 
[INFO] Reactor Summary: 
[INFO] 
[INFO] Apache Hadoop Main ................................ SUCCESS [ 3.836 s] 
[INFO] Apache Hadoop Project POM ......................... SUCCESS [ 1.634 s] 
[INFO] Apache Hadoop Annotations ......................... SUCCESS [ 3.661 s] 
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [ 0.441 s] 
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [ 2.310 s] 
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [ 4.806 s] 
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [ 4.093 s] 
[INFO] Apache Hadoop Auth ................................ SUCCESS [ 4.638 s] 
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [ 2.795 s] 
[INFO] Apache Hadoop Common .............................. SUCCESS [01:38 min] 
[INFO] Apache Hadoop NFS ................................. SUCCESS [ 11.257 s] 
[INFO] Apache Hadoop Common Project ...................... SUCCESS [ 0.051 s] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [ 15.643 s] 
[INFO] Apache Hadoop HttpFS .............................. SKIPPED 
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED 

Thanks and best regards.


I ran into the same problem, and it was solved with these commands: export MAVEN_OPTS="-Xmx512m -XX:MaxPermSize=128m" and export JVM_ARGS="-XX:PermSize=64M -XX:MaxPermSize=256m" – MarHserus

Answers


This error comes from the Maven compiler plugin. Edit the POM file of the project you are building and, under the maven-compiler-plugin element, add the configuration below:

<configuration> 
<verbose>true</verbose> 
<fork>true</fork> 
</configuration> 

This should take care of the heap space error.
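
For context, here is a minimal sketch of how that configuration might sit inside a maven-compiler-plugin declaration under build/plugins in the pom you build from; the meminitial and maxmem elements are optional plugin parameters (only honoured when fork is true), and the values shown are illustrative, not taken from the Hadoop poms:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <verbose>true</verbose>
    <!-- fork javac into its own process so it gets its own heap -->
    <fork>true</fork>
    <!-- initial/maximum heap for the forked compiler; example values -->
    <meminitial>256m</meminitial>
    <maxmem>1024m</maxmem>
  </configuration>
</plugin>

Forking matters here because an in-process javac shares the Maven JVM's heap, and the stack trace above (compileInProcess in JavacCompiler) shows that is exactly where the build ran out of memory.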


Hi @bigdatauser, which Hadoop pom file should the above configuration be added to? –


The pom in the directory you are building from, i.e. the Hadoop source directory – bigdatauser


This worked for me when I hit the same Java heap space problem building Hadoop 2.7.1 for a Raspberry Pi 2 B. Thanks – Blake


For mac/linux users, just add the export statement to your ~/.profile (or similarly named file). For example:

export MAVEN_OPTS="-Xmx2048m -XX:MaxPermSize=512m" 

and reload your shell. Worked for me.
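
As a concrete sketch (the profile file name is an assumption here — on macOS with bash it is often ~/.bash_profile rather than ~/.profile):

# append the Maven memory settings to the shell profile
echo 'export MAVEN_OPTS="-Xmx2048m -XX:MaxPermSize=512m"' >> ~/.profile
# reload the profile in the current shell (or open a new terminal)
source ~/.profile
# confirm the variable is actually set before re-running the build
echo $MAVEN_OPTS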


Hi @Kumar, thanks for the quick reply. I tried setting MAVEN_OPTS in .bash_profile on my mac, but I still get the same error. When I try to build the hadoop-hdfs project on its own, it succeeds. My intention is to understand Hadoop's block placement strategy; do I need to build the whole project? –
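
For what it's worth, building just that module (plus the modules it depends on) from the source root is possible with Maven's project-selection flags; the module path below is taken from the log above, while the goal and -DskipTests flag are assumptions about a typical build:

# build only hadoop-hdfs and the modules it depends on, skipping tests
mvn install -pl hadoop-hdfs-project/hadoop-hdfs -am -DskipTests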
