Hadoop C++ HDFS test throws an exception at runtime

I am working with Hadoop 2.2.0 and trying to run this hdfs_test.cpp application:

#include "hdfs.h" 

#include <fcntl.h>   /* O_WRONLY, O_CREAT */ 
#include <stdio.h>   /* fprintf */ 
#include <stdlib.h>  /* exit */ 
#include <string.h>  /* strlen */ 

int main(int argc, char **argv) { 

    hdfsFS fs = hdfsConnect("default", 0); 
    const char* writePath = "/tmp/testfile.txt"; 
    hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY|O_CREAT, 0, 0, 0); 
    if (!writeFile) { 
      fprintf(stderr, "Failed to open %s for writing!\n", writePath); 
      exit(-1); 
    } 
    const char* buffer = "Hello, World!"; 
    tSize num_written_bytes = hdfsWrite(fs, writeFile, (void*)buffer, strlen(buffer)+1); 
    if (hdfsFlush(fs, writeFile)) { 
      fprintf(stderr, "Failed to 'flush' %s\n", writePath); 
      exit(-1); 
    } 
    hdfsCloseFile(fs, writeFile); 
    hdfsDisconnect(fs); 
    return 0; 
} 

It compiles, but when I run it with ./hdfs_test I get this:

loadFileSystems error: 
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.) 
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error: 
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.) 
hdfsOpenFile(/tmp/testfile.txt): constructNewObjectOfPath error: 
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.) 
Failed to open /tmp/testfile.txt for writing! 

Maybe it is a classpath problem. My $HADOOP_HOME is /usr/local/hadoop, and this is my actual *CLASSPATH* variable:

echo $CLASSPATH 
/usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/*:/usr/local/hadoop/share/hadoop/common/*:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/*:/usr/local/hadoop/share/hadoop/hdfs/*:/usr/local/hadoop/share/hadoop/yarn/lib/*:/usr/local/hadoop/share/hadoop/yarn/*:/usr/local/hadoop/share/hadoop/mapreduce/lib/*:/usr/local/hadoop/share/hadoop/mapreduce/*:/contrib/capacity-scheduler/*.jar 

Any help is appreciated. Thanks!

Answers


I have faced this issue with wildcards in the classpath when using JNI-based programs. Try the direct jar-in-classpath approach instead, such as the one generated in this sample script at https://github.com/QwertyManiac/cdh4-libhdfs-example/blob/master/exec.sh#L3, and I believe it should work. The complete sample at https://github.com/QwertyManiac/cdh4-libhdfs-example does currently work. See also:

https://stackoverflow.com/a/9322747/1660002
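The point of the answer above is that the JVM libhdfs spawns through JNI does not expand `*` wildcards in the classpath (wildcard expansion is a convenience of the `java` launcher, not of the JNI invocation API), so every jar must appear as an explicit entry. Here is a minimal sketch of that jar-by-jar expansion, using dummy jars in a temporary directory (the paths are illustrative stand-ins, not a real Hadoop layout):

```shell
#!/bin/sh
# Create a stand-in for a Hadoop lib directory (dummy jars for illustration).
LIBDIR=$(mktemp -d)
touch "$LIBDIR/hadoop-common-2.2.0.jar" "$LIBDIR/hadoop-hdfs-2.2.0.jar"

# A CLASSPATH entry like "$LIBDIR/*" would NOT be expanded by the JVM that
# libhdfs embeds, so list the jars explicitly, joined by colons, instead:
EXPANDED=$(find "$LIBDIR" -name '*.jar' | tr '\n' ':')
echo "$EXPANDED"   # colon-separated list with one entry per jar

rm -rf "$LIBDIR"
```

Applied to a real installation, the same loop over the jar directories under $HADOOP_HOME/share/hadoop would produce the explicit classpath the embedded JVM needs.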

Comment: I had to work around the same problem with the following crutch to add every jar to the CLASSPATH variable: `export CLASSPATH=$(for p in $(hadoop classpath --glob | sed 's/:/ /g'); do find $p -name '*.jar' 2>/dev/null; done | tr '\n' ':')`


Try this:

hadoop classpath --glob 

and then add the output to the CLASSPATH variable in your ~/.bashrc.
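For example, the export could look like this in ~/.bashrc (a sketch, assuming the hadoop CLI is on your PATH; `--glob` prints the classpath with wildcards already expanded):

```shell
# ~/.bashrc fragment (sketch): export the Hadoop classpath with all
# wildcards expanded, so the JVM embedded by libhdfs sees every jar.
# Guarded so the line is a harmless no-op on machines without hadoop.
if command -v hadoop >/dev/null 2>&1; then
    export CLASSPATH=$(hadoop classpath --glob)
fi
```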
