
RHadoop job fails on a single-node Ubuntu cluster

I am posting a similar question for the second time because I believe I now have a more accurate picture of the problem.

Environment: Hadoop 2.2.0 running as a single-node cluster on an Ubuntu 14.04 laptop; RStudio version 0.98.507, R version 3.0.2 (2013-09-25), Java version 1.7.0_55.

Any R (or Python) program works fine with the Hadoop Streaming utility located at /usr/local/hadoop220/share/hadoop/tools/lib/hadoop-streaming-2.2.0.jar.

The error appears when an R program run from RStudio uses the package "rmr" (part of RHadoop) and calls mapreduce().

To keep this post simple, I am showing a very small program that fails (other, larger programs fail with the same error message):

# Point rhdfs/rmr2 at the local Hadoop 2.2.0 installation
Sys.setenv(HADOOP_CMD="/usr/local/hadoop220/bin/hadoop") 
Sys.setenv(HADOOP_STREAMING="/usr/local/hadoop220/share/hadoop/tools/lib/hadoop-streaming-2.2.0.jar") 
library('rhdfs') 
library('rmr2') 
hdfs.init() 
hdfs.ls("/user/hduser") 
# Write the integers 1:1000 to HDFS, then square each value in a map-only job
small.ints = to.dfs(1:1000) 
mapreduce(
    input = small.ints, 
    map = function(k, v) cbind(v, v^2)) 

The errors shown on the RStudio console are:

> Sys.setenv(HADOOP_CMD="/usr/local/hadoop220/bin/hadoop") 
> Sys.setenv(HADOOP_STREAMING="/usr/local/hadoop220/share/hadoop/tools/lib/hadoop-streaming-2.2.0.jar") 
> library('rhdfs') 
Loading required package: rJava 

HADOOP_CMD=/usr/local/hadoop220/bin/hadoop 

Be sure to run hdfs.init() 
> library('rmr2') 
Loading required package: Rcpp 
Loading required package: RJSONIO 
Loading required package: bitops 
Loading required package: digest 
Loading required package: functional 
Loading required package: reshape2 
Loading required package: stringr 
Loading required package: plyr 
Loading required package: caTools 
> hdfs.init() 
14/05/10 14:20:39 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
> hdfs.ls("/user/hduser") 
    permission owner  group size   modtime     file 
1 drwxr-xr-x hduser supergroup 0 2014-05-07 17:44  /user/hduser/BT 
2 drwxr-xr-x hduser supergroup 0 2014-05-09 07:14 /user/hduser/BT-out 
3 drwxr-xr-x hduser supergroup 0 2014-05-09 20:30 /user/hduser/BTR-out 
4 drwxr-xr-x hduser supergroup 0 2014-05-07 17:44 /user/hduser/BTj-in 
> small.ints = to.dfs(1:1000) 
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /usr/local/hadoop220/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now. 
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'. 
14/05/10 14:20:50 WARN util.NativeCodeLoader: ... using builtin-java classes where applicable 

[ these two messages repeat multiple times ] 

> mapreduce(
+ input = small.ints, 
+ map = function(k, v) cbind(v, v^2)) 

Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /usr/local/hadoop220/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now. 
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'. 
14/05/10 14:21:12 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 

14/05/10 14:21:20 WARN streaming.StreamJob: -file option is deprecated, please use generic option -files instead. 

packageJobJar: [/tmp/RtmpYCerEW/rmr-local-env282d4c7a3b53, /tmp/RtmpYCerEW/rmr-global-env282d77c9da92, /tmp/RtmpYCerEW/rmr-streaming-map282d4225651a, /tmp/hadoop-hduser/hadoop-unjar678942474363050554/] [] /tmp/streamjob8073315154972274831.jar tmpDir=null 
14/05/10 14:21:21 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032 
14/05/10 14:21:21 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032 
14/05/10 14:21:22 INFO mapred.FileInputFormat: Total input paths to process : 1 
14/05/10 14:21:22 INFO mapreduce.JobSubmitter: number of splits:2 
14/05/10 14:21:22 INFO Configuration.deprecation: user.name is deprecated. Instead, use mapreduce.job.user.name 
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar 
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.cache.files.filesizes is deprecated. Instead, use mapreduce.job.cache.files.filesizes 
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.cache.files is deprecated. Instead, use mapreduce.job.cache.files 
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces 
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.output.value.class is deprecated. Instead, use mapreduce.job.output.value.class 
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.mapoutput.value.class is deprecated. Instead, use mapreduce.map.output.value.class 
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.job.name is deprecated. Instead, use mapreduce.job.name 
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir 
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir 
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps 
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.cache.files.timestamps is deprecated. Instead, use mapreduce.job.cache.files.timestamps 
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.output.key.class is deprecated. Instead, use mapreduce.job.output.key.class 
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.mapoutput.key.class is deprecated. Instead, use mapreduce.map.output.key.class 
14/05/10 14:21:22 INFO Configuration.deprecation: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir 
14/05/10 14:21:23 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1399709731242_0003 
14/05/10 14:21:23 INFO impl.YarnClientImpl: Submitted application application_1399709731242_0003 to ResourceManager at /0.0.0.0:8032 
14/05/10 14:21:23 INFO mapreduce.Job: The url to track the job: http://yantrajaal:8088/proxy/application_1399709731242_0003/ 
14/05/10 14:21:23 INFO mapreduce.Job: Running job: job_1399709731242_0003 
14/05/10 14:21:30 INFO mapreduce.Job: Job job_1399709731242_0003 running in uber mode : false 
14/05/10 14:21:30 INFO mapreduce.Job: map 0% reduce 0% 
14/05/10 14:21:43 INFO mapreduce.Job: Task Id : attempt_1399709731242_0003_m_000000_0, Status : FAILED 
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1 
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320) 
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533) 
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130) 
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61) 
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34) 
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) 
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:415) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491) 
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157) 

14/05/10 14:21:44 INFO mapreduce.Job: Task Id : attempt_1399709731242_0003_m_000001_0, Status : FAILED 
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1 
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320) 
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533) 
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130) 
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61) 
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34) 
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) 
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:415) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491) 
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157) 

14/05/10 14:22:04 INFO mapreduce.Job: Task Id : attempt_1399709731242_0003_m_000000_1, Status : FAILED 
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1 
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320) 
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533) 
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130) 
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61) 
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34) 
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) 
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:415) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491) 
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157) 

Container killed by the ApplicationMaster. 
Container killed on request. Exit code is 143 

14/05/10 14:22:04 INFO mapreduce.Job: Task Id : attempt_1399709731242_0003_m_000001_1, Status : FAILED 
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1 
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320) 
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533) 
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130) 
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61) 
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34) 
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) 
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:415) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491) 
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157) 

14/05/10 14:22:17 INFO mapreduce.Job: Task Id : attempt_1399709731242_0003_m_000001_2, Status : FAILED 
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1 
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320) 
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533) 
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130) 
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61) 
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34) 
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) 
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:415) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491) 
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157) 

14/05/10 14:22:17 INFO mapreduce.Job: Task Id : attempt_1399709731242_0003_m_000000_2, Status : FAILED 
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1 
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320) 
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533) 
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130) 
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61) 
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34) 
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) 
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:415) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491) 
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157) 

14/05/10 14:22:26 INFO mapreduce.Job: map 100% reduce 0% 
14/05/10 14:22:26 INFO mapreduce.Job: Job job_1399709731242_0003 failed with state FAILED due to: Task failed task_1399709731242_0003_m_000001 
Job failed as tasks failed. failedMaps:1 failedReduces:0 

14/05/10 14:22:26 INFO mapreduce.Job: Counters: 10 
    Job Counters 
     Failed map tasks=7 
     Killed map tasks=1 
     Launched map tasks=8 
     Other local map tasks=6 
     Data-local map tasks=2 
     Total time spent by all maps in occupied slots (ms)=91997 
     Total time spent by all reduces in occupied slots (ms)=0 
    Map-Reduce Framework 
     CPU time spent (ms)=0 
     Physical memory (bytes) snapshot=0 
     Virtual memory (bytes) snapshot=0 
14/05/10 14:22:26 ERROR streaming.StreamJob: Job not Successful! 
Streaming Command Failed! 
Error in mr(map = map, reduce = reduce, combine = combine, vectorized.reduce, : 
    hadoop streaming failed with error code 1 
> 

I googled the two irritating warnings: (a) the disabled stack guard warning, which according to this link is just a warning and "nothing to worry about"; and (b) "Unable to load native-hadoop library for your platform... using builtin-java classes where applicable", which according to this link is also just a warning and nothing to worry about.

After discounting these two warnings as not being the cause, the main error, I think, is here:

14/05/10 14:21:43 INFO mapreduce.Job: Task Id : attempt_1399709731242_0003_m_000000_0, Status : FAILED 
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1 
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320) 
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533) 
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130) 
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61) 
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34) 
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) 
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:415) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491) 
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157) 

I have reinstalled the RHadoop packages rmr and rhdfs, and rJava has also been reinstalled. In the past I also tried this with Hadoop 1.3, but the error is the same.
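As a quick sanity check (a minimal sketch, assuming only the HADOOP_CMD and HADOOP_STREAMING paths set above), one can look at what a fresh R session sees and where the packages actually live:

# The streaming map task starts its own R process, so that process must also
# be able to see these variables and find the packages on its library path.
Sys.getenv(c("HADOOP_CMD", "HADOOP_STREAMING"))
find.package(c("rmr2", "rhdfs"))   # where the packages are actually installed
.libPaths()                        # the library directories R searches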

I would be grateful if someone could suggest a way forward on this.


I am getting the same error as well. If you have any solution, please share it. –

Answer


I solved this problem by changing the directory into which the rmr2, rhdfs, ... packages are installed. Basically, you need to install all of the packages in the system library folder rather than in a custom folder. The installation location seems to be the issue. Initially I had installed the packages in a custom folder:

/home/user/R/3.1 

By reinstalling the packages in:

/usr/lib/R/library 

I got the code to work.
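As a rough sketch of that reinstall (the tarball file names and version numbers below are only placeholders; use the archives you downloaded from the RHadoop repository), start R with sufficient privileges, for example via sudo R, so the packages land in the system library:

# Sketch only: install the dependencies and the RHadoop packages into the
# system-wide library instead of a per-user one (paths/versions are examples).
sys_lib <- "/usr/lib/R/library"

# CRAN dependencies seen in the package loading messages above
install.packages(c("rJava", "Rcpp", "RJSONIO", "bitops", "digest",
                   "functional", "reshape2", "stringr", "plyr", "caTools"),
                 lib = sys_lib)

# rmr2 and rhdfs are installed from their source tarballs, not from CRAN
install.packages("rmr2_3.1.0.tar.gz",  repos = NULL, type = "source", lib = sys_lib)
install.packages("rhdfs_1.0.8.tar.gz", repos = NULL, type = "source", lib = sys_lib)

After that, library('rmr2') and library('rhdfs') resolve from /usr/lib/R/library regardless of which user starts the R process.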

I hope this helps.


Yes! This solution works. But you need to know the location of R's default library. In my case, for example, it is /usr/lib64/R/library. To find out where the various R components are located, see this link: https://estrip.org/articles/read/tinypliny/55510/Where_is_R_installed_on_Linux_.html – Calcutta
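A quick way to check this from within R itself, using only base R:

.Library      # R's default system library, e.g. /usr/lib/R/library or /usr/lib64/R/library
.libPaths()   # the full search path of library directories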
