
When running mongo-hadoop streaming, I get the following error: mongo-hadoop streaming cannot find mapper.py:

java.io.IOException: Cannot run program "mapper.py": error=2, No such file or directory 
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:460) 
    at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:214) 
    at org.apache.hadoop.streaming.PipeMapper.configure(PipeMapper.java:66) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) 
    at java.lang.reflect.Method.invoke(Method.java:597) 
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88) 
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64) 
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117) 
    at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) 
    at java.lang.reflect.Method.invoke(Method.java:597) 
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88) 
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64) 
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117) 
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:387) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325) 
    at org.apache.hadoop.mapred.Child$4.run(Child.java:270) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:396) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1157) 
    at org.apache.hadoop.mapred.Child.main(Child.java:264) 
Caused by: java.io.IOException: error=2, No such file or directory 
    at java.lang.UNIXProcess.forkAndExec(Native Method) 
    at java.lang.UNIXProcess.<init>(UNIXProcess.java:53) 
    at java.lang.ProcessImpl.start(ProcessImpl.java:91) 
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:453) 
    ... 24 more 

I have no problem running standard Hadoop Python streaming jobs, either through Dumbo or the regular way.

This error is mentioned in another post about Python streaming in Hadoop.

I'm running the job like this:

hadoop jar /Volumes/Locodrive/hadoop/mongo-hadoop/streaming/target/mongo-hadoop-streaming-assembly-1.1.0-SNAPSHOT.jar -mapper mapper.py -file mapper.py -reducer reducer.py -file reducer.py -inputURI mongodb://localhost:27017/testdb.docs -outputURI mongodb://localhost:27017/testdb.testhadoop 

Using relative or absolute paths for mapper.py/reducer.py, and passing absolute paths in the -file arguments, did not help. Standard Hadoop streaming jobs run without any problems, so I don't get this error there.

Adding hdfs:// paths for mapper.py and reducer.py did not help either.

mapper.py and reducer.py are executable and have a shebang on the first line (a quick local check is sketched after the two listings):

mapper.py

#!/usr/bin/env python 

import sys 
sys.path.append(".") 


from pymongo_hadoop import BSONMapper 


def mapper(documents):
    # emit one record per input document under a single constant key
    i = 0
    for doc in documents:
        i += 1
        yield {"_id": "test", "count": 1}


BSONMapper(mapper)
print >> sys.stderr, "Done Mapping!!!" 

reducer.py

#!/usr/bin/env python 
# encoding: utf-8 

import sys 
sys.path.append('.') 


from pymongo_hadoop import BSONReducer 


def reducer(key, values):
    print >> sys.stderr, "Processing key %s" % key
    _count = 0
    for v in values:
        _count += v["count"]

    return {"_id": key, "count": _count}

BSONReducer(reducer) 
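
As a side note, here is a minimal local check for the "executable + shebang" claim above; it assumes the two scripts sit in the current directory:

#!/usr/bin/env python
# quick sanity check: both scripts should be executable and start with a shebang
import os
import stat

for script in ("mapper.py", "reducer.py"):
    mode = os.stat(script).st_mode
    first_line = open(script).readline()
    print script, "executable:", bool(mode & stat.S_IXUSR), "shebang:", first_line.startswith("#!")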

I'm running Cloudera Hadoop CDH3u3 on OS X. The Java examples run without problems.

UPDATE

I tried 0.23.1 as well and got the same error.

Running with -debug does not delete the packaged job jar (streamjob.jar).

When I unpack it, mapper.py and reducer.py are both in there.

These files are also present when running a standard streaming job, yet mongo-hadoop-streaming still produces the error above.
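
For completeness, the packaged job jar is just a zip archive, so its contents can also be listed with a few lines of Python; the jar path below is a placeholder for whatever streamjob jar the -debug run leaves behind:

#!/usr/bin/env python
# list the packaged job jar to confirm mapper.py/reducer.py were bundled
# NOTE: the path is a placeholder; use the streamjob jar left behind by -debug
import zipfile

for name in zipfile.ZipFile("/tmp/streamjob.jar").namelist():
    print name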

Answer


OK, got it working. -files should be used instead of -file:

http://hadoop.apache.org/common/docs/r0.20.0/api/org/apache/hadoop/util/GenericOptionsParser.html

hadoop jar /Volumes/Locodrive/hadoop/mongo-hadoop/streaming/target/mongo-hadoop-streaming-assembly-1.1.0-SNAPSHOT.jar -files mapper.py,reducer.py -inputURI mongodb://127.0.0.1:27017/mongo_hadoop.yield_historical.in -outputURI mongodb://127.0.0.1:27017/mongo_hadoop.testhadoop -mapper mapper.py -reducer reducer.py -verbose -debug
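
To confirm the job actually wrote results, the output collection named in -outputURI can be read back directly; a minimal sketch using the pre-3.0 pymongo API:

#!/usr/bin/env python
# read back the job output from the collection named in -outputURI above
from pymongo import Connection  # pre-3.0 pymongo API

conn = Connection("127.0.0.1", 27017)
for doc in conn["mongo_hadoop"]["testhadoop"].find():
    print doc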