2015-09-26

I am running Hadoop 2.6.1 on OS X 10.10.5 and getting this warning: WARN util.NativeCodeLoader

The full warning I get is:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

I have read that this warning can be caused by running a 32-bit native library (libhadoop.so.1.0.0) against a 64-bit build of Hadoop. I checked my libhadoop.so.1.0.0 and it is 64-bit:

$ find ~/hadoop-2.6.1/ -name libhadoop.so.1.0.0 -ls 
136889669  1576 -rwxr-xr-x 1 davidlaxer  staff    806303 Sep 16 14:18 /Users/davidlaxer/hadoop-2.6.1//lib/native/libhadoop.so.1.0.0 

$ file /Users/davidlaxer/hadoop-2.6.1//lib/native/libhadoop.so.1.0.0 
    /Users/davidlaxer/hadoop-2.6.1//lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped 

$ env | grep HADOOP 
HADOOP_HOME=/Users/davidlaxer/hadoop-2.6.1 
HADOOP_COMMON_LIB_NATIVE_DIR=/Users/davidlaxer/hadoop-2.6.1/lib/native 
HADOOP_INSTALL=/Users/davidlaxer/hadoop-2.6.1 
HADOOP_CONF_DIR=/Users/davidlaxer/hadoop-2.6.1/etc/hadoop 
HADOOP_OPTS=-Djava.library.path=/Users/davidlaxer/hadoop-2.6.1/lib 

$ hadoop version 
Hadoop 2.6.1 
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r b4d876d837b830405ccdb6af94742f99d49f9c04 
Compiled by jenkins on 2015-09-16T21:07Z 
Compiled with protoc 2.5.0 
From source with checksum ba9a9397365e3ec2f1b3691b52627f 
This command was run using /Users/davidlaxer/hadoop-2.6.1/share/hadoop/common/hadoop-common-2.6.1.jar 

$ java -version 
java version "1.8.0_05" 
Java(TM) SE Runtime Environment (build 1.8.0_05-b13) 
Java HotSpot(TM) 64-Bit Server VM (build 25.5-b02, mixed mode) 

$ hadoop checknative -a 
15/09/26 11:01:28 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
Native library checking: 
hadoop: false 
zlib: false 
snappy: false 
lz4:  false 
bzip2: false 
openssl: false 
15/09/26 11:01:29 INFO util.ExitUtil: Exiting with status 1 


$ sudo port list zlib 
Password: 
zlib       @1.2.8   archivers/zlib 
$ sudo port list snappy 
snappy       @1.1.1   archivers/snappy 
$ sudo port list lz4 
lz4       @r130   archivers/lz4 
$ sudo port list bzip2 
bzip2       @1.0.6   archivers/bzip2 
$ sudo port list openssl 
openssl      @1.0.2d   devel/openssl 

$env | grep CLASS 

CLASSPATH=/users/davidlaxer/trunk/core/src/test/java/:/Users/davidlaxer/hadoop-2.6.1-src/hadoop-dist/target/hadoop-dist-2.6.1.jar:/Users/davidlaxer/clojure/target:/Users/davidlaxer/hadoop-2.6.1/lib/native: 

Any ideas?

Answers

One possible cause of the warning is a processor architecture mismatch (32- vs 64-bit). Another is that the native library is not on the Java library path (or a library it depends on is not on the path).

Running the following should give you more detailed information:

hadoop checknative -a 

Also, doesn't your HADOOP_OPTS actually point to the lib directory rather than lib/native?
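A minimal sketch of that suggested fix, pointing java.library.path at lib/native instead of lib (the path is taken from the question's env output, so adjust it to your own install):

```shell
# Point java.library.path at lib/native, not lib.
# HADOOP_PREFIX here is the install root shown in the question.
HADOOP_PREFIX="/Users/davidlaxer/hadoop-2.6.1"
export HADOOP_OPTS="-Djava.library.path=${HADOOP_PREFIX}/lib/native"
echo "$HADOOP_OPTS"
```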

Maybe you should also add the following to your log4j.properties file to get more information:

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG
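A quick way to try that is to append the line to the log4j.properties under HADOOP_CONF_DIR; the sketch below uses a mktemp scratch file in place of the real config so nothing is modified:

```shell
# Append the DEBUG logger line to a scratch copy of log4j.properties.
LOG4J_PROPS="$(mktemp)"
echo "log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG" >> "$LOG4J_PROPS"
# Confirm the line landed in the file.
grep -c 'NativeCodeLoader' "$LOG4J_PROPS"
```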

There are also a couple of other things you can try:

1)

sudo hadoop checknative -a 

(to check whether it is a permissions problem)

2) Additionally, set LD_LIBRARY_PATH.
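A sketch of that second suggestion, prepending Hadoop's native directory (path taken from the question) to the loader path. Note that on Linux the variable is LD_LIBRARY_PATH, while on OS X the analogous variable is DYLD_LIBRARY_PATH:

```shell
# Prepend the native library dir to the dynamic loader search path.
NATIVE_DIR="/Users/davidlaxer/hadoop-2.6.1/lib/native"
export LD_LIBRARY_PATH="${NATIVE_DIR}${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "$LD_LIBRARY_PATH"
```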

$ hadoop checknative -a 
15/09/26 11:01:28 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
Native library checking: 
hadoop: false 
zlib: false 
snappy: false 
lz4: false 
bzip2: false 
openssl: false 
15/09/26 11:01:29 INFO util.ExitUtil: Exiting with status 1 – dbl001

$ env | grep CLASS 
CLASSPATH=/users/davidlaxer/trunk/core/src/test/java/:/Users/davidlaxer/hadoop-2.6.1-src/hadoop-dist/target/hadoop-dist-2.6.1.jar:/Users/davidlaxer/clojure/target:/Users/davidlaxer/hadoop-2.6.1/lib/native: – dbl001

Actually, from the checknative output it looks like you need to install some more libraries. – benohead


I built hadoop-3.0.0-SNAPSHOT from source with native code support and adjusted the environment variable HADOOP_COMMON_LIB_NATIVE_DIR:
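For reference, a sketch of the usual native build invocation, as documented in Hadoop's BUILDING.txt (it requires Maven, a JDK, protoc 2.5.0, and cmake; the exact flags may differ for your checkout):

```shell
# Standard Maven invocation for a Hadoop dist build with native libraries.
BUILD_CMD="mvn package -Pdist,native -DskipTests -Dtar"
echo "$BUILD_CMD"
```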


Now both versions of Hadoop (e.g. 2.3.0 and the 3.0 snapshot) run without the warning:

HADOOP_HOME=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT 
HADOOP_COMMON_LIB_NATIVE_DIR=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native 
HADOOP_INSTALL=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT 
HADOOP_CONF_DIR=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/etc/hadoop 
HADOOP_OPTS=-Djava.library.path=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native 


$ hadoop version 
Hadoop 2.3.0 
Subversion http://svn.apache.org/repos/asf/hadoop/common -r 1567123 
Compiled by jenkins on 2014-02-11T13:40Z 
Compiled with protoc 2.5.0 
From source with checksum dfe46336fbc6a044bc124392ec06b85 
This command was run using /Users/davidlaxer/hadoop-2.3.0/share/hadoop/common/hadoop-common-2.3.0.jar 

$ bin/hadoop version 
Hadoop 3.0.0-SNAPSHOT 
Source code repository https://github.com/apache/hadoop.git -r 83e65c5fe84819b6c6da015b269fb4e46a88d105 
Compiled by davidlaxer on 2015-09-26T22:46Z 
Compiled with protoc 2.5.0 
From source with checksum 883bd12016a9bbe21eb0ae4b6beaa 
This command was run using /Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/share/hadoop/common/hadoop-common-3.0.0-SNAPSHOT.jar 

$ bin/hadoop fs -ls /users/davidlaxer/genomics/reads/HG00103 
Found 1 items 
drwxr-xr-x - davidlaxer staff  102 2015-09-26 09:13 /users/davidlaxer/genomics/reads/HG00103/_temporary 
David-Laxers-MacBook-Pro:hadoop-3.0.0-SNAPSHOT davidlaxer$ bin/hadoop version 
Hadoop 3.0.0-SNAPSHOT 
Source code repository https://github.com/apache/hadoop.git -r 83e65c5fe84819b6c6da015b269fb4e46a88d105 
Compiled by davidlaxer on 2015-09-26T22:46Z 
Compiled with protoc 2.5.0 
From source with checksum 883bd12016a9bbe21eb0ae4b6beaa 
This command was run using /Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/share/hadoop/common/hadoop-common-3.0.0-SNAPSHOT.jar 

I had the same problem and solved it by adding the following XML files to the job configuration.

Configuration conf = job.getConfiguration(); 

// To access HDFS files
conf.addResource(new Path(hadoopHome + "/conf/core-site.xml")); // In my case hadoopHome is /usr/lib/hadoop
conf.addResource(new Path(hadoopHome + "/conf/hdfs-site.xml")); 

conf.addResource(new Path(hadoopHome + "/conf/mapred-site.xml")); // To run a MapReduce job

/* Other configuration comes here */

I had the same problem. Fortunately, it has since been solved; you can see the solution below.

The warning, even when it appears a few times, does not stop you from uploading files to HDFS; once it no longer appears, there is no such problem.

Open hadoop-env.sh, go to the end of the file, and try adding the following lines:

export HADOOP_HOME_WARN_SUPPRESS=1 
export HADOOP_ROOT_LOGGER="WARN,DRFA"
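A small sketch of applying those two lines from the shell; a mktemp scratch file stands in for the real hadoop-env.sh here, so substitute your actual config path:

```shell
# Append the two suppression lines to (a stand-in for) hadoop-env.sh.
HADOOP_ENV="$(mktemp)"
cat >> "$HADOOP_ENV" <<'EOF'
export HADOOP_HOME_WARN_SUPPRESS=1
export HADOOP_ROOT_LOGGER="WARN,DRFA"
EOF
# Both export lines should now be present.
grep -c '^export' "$HADOOP_ENV"
```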