
I am trying to set up a Hadoop development machine on Windows 10 Home edition. The build fails on project hadoop-hdfs with "Failed to execute goal maven-antrun-plugin:1.7 (make) on project hadoop-hdfs: An Ant BuildException has occured ... around Ant part" in hadoop-hdfs-project/hadoop-hdfs. What is causing this?

The source I am building is hadoop-2.7.3-src.

Here are some details about my local development environment:

- Windows 10 Home edition
- Intel Core i5-6200U CPU @ 2.30GHz
- 16 GB RAM
- 64-bit operating system, x64-based processor
- Microsoft Visual Studio Community 2015, Version 14.0.25431.01 Update 3
- .NET Framework 4.6.01586
- cmake version 3.7.2
- CYGWIN_NT-10.0 LTPBCV82DUG 2.7.0(0.306/5/3) 2017-02-12 13:18 x86_64 Cygwin
- java version "1.8.0_121"
- Java(TM) SE Runtime Environment (build 1.8.0_121-b13)
- Java HotSpot(TM) 64-Bit Server VM (build 25.121-b13, mixed mode)
- Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-10T11:41:47-05:00)
- Google Protocol Buffers: protoc --version reports libprotoc 2.5.0
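
As a quick sanity check (just a sketch; it assumes all of these tools are on the PATH of the prompt used for the build below), the versions above can be confirmed with:

where cl
cmake --version
protoc --version
mvn -version
java -version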

I opened the Developer Command Prompt for Visual Studio 2015 (VS2015) and ran:

C:\hadoop\hadoop-2.7.3-src> mvn package -Pdist,native-win -DskipTests -Dtar -X

Unfortunately, I got the following error:

[INFO] ------------------------------------------------------------------------ 
[INFO] BUILD FAILURE 
[INFO] ------------------------------------------------------------------------ 
[INFO] Total time: 06:27 min 
[INFO] Finished at: 2017-03-15T19:26:50-04:00 
[INFO] Final Memory: 102M/1591M 
[INFO] ------------------------------------------------------------------------ 
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 
[ERROR] around Ant part ...<exec failonerror="true" dir="C:\hadoop\hadoop-2.7.3-src\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake">... @ 8:126 in C:\hadoop\hadoop-2.7.3-src\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml 
[ERROR] -> [Help 1] 
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 
around Ant part ...<exec failonerror="true" dir="C:\hadoop\hadoop-2.7.3-src\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake">... @ 8:126 in C:\hadoop\hadoop-2.7.3-src\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml 
     at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:212) 
     at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153) 
     at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145) 
     at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116) 
     at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80) 
     at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51) 
     at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128) 
     at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307) 
     at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193) 
     at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106) 
     at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863) 
     at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288) 
     at org.apache.maven.cli.MavenCli.main(MavenCli.java:199) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:498) 
     at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289) 
     at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229) 
     at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415) 
     at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356) 
Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: exec returned: 1 
around Ant part ...<exec failonerror="true" dir="C:\hadoop\hadoop-2.7.3-src\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake">... @ 8:126 in C:\hadoop\hadoop-2.7.3-src\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml 
     at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:355) 
     at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134) 
     at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207) 
     ... 20 more 
Caused by: C:\hadoop\hadoop-2.7.3-src\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml:8: exec returned: 1 
     at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:646) 
     at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:672) 
     at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:498) 
     at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291) 
     at sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:498) 
     at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106) 
     at org.apache.tools.ant.Task.perform(Task.java:348) 
     at org.apache.tools.ant.Target.execute(Target.java:390) 
     at org.apache.tools.ant.Target.performTasks(Target.java:411) 
     at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1399) 
     at org.apache.tools.ant.Project.executeTarget(Project.java:1368) 
     at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:327) 
     ... 22 more 
[ERROR] 
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles: 
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException 
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command 
[ERROR] mvn <goals> -rf :hadoop-hdfs 
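
The log above only shows that the cmake exec exited with status 1, not why. As a rough next step (assuming the paths reported in the [ERROR] lines), the underlying problem can usually be seen in CMake's own logs, inspected from the same VS2015 prompt:

rem Paths taken from the error log above; these log files exist only if
rem the cmake configure step actually started
cd C:\hadoop\hadoop-2.7.3-src\hadoop-hdfs-project\hadoop-hdfs\target\native
type CMakeFiles\CMakeError.log
type CMakeFiles\CMakeOutput.log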

Answer


I got the same error on CentOS 7.2 while hadoop-common was building; it went away after I ran sudo yum -y install zlib and sudo yum -y install zlib-devel.

After that I got another error in the Hadoop pipes module, so I ran sudo yum -y install openssl-devel,

and the build completed successfully for me.
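
Summarizing the fix on CentOS 7.2 in one place, the packages installed were:

# zlib headers/libraries for the hadoop-common native build
sudo yum -y install zlib zlib-devel
# OpenSSL headers for the hadoop-pipes native build
sudo yum -y install openssl-devel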

Hope this is useful.