
I ran into the following error while running start-dfs.sh:

start-dfs.sh 
16/10/02 23:10:54 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
Starting namenodes on [localhost] 
localhost: starting namenode, logging to /opt/hadoop/logs/hadoop-root-namenode-Web.out 
localhost: nice: /home/hadoop/hadoop/bin/hdfs: No such file or directory 
localhost: starting datanode, logging to /opt/hadoop/logs/hadoop-root-datanode-Web.out 
localhost: nice: /home/hadoop/hadoop/bin/hdfs: No such file or directory 
Starting secondary namenodes [0.0.0.0] 
0.0.0.0: starting secondarynamenode, logging to /opt/hadoop/logs/hadoop-root-secondarynamenode-Web.out 
0.0.0.0: nice: /home/hadoop/hadoop/bin/hdfs: No such file or directory 
16/10/02 23:11:12 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 

Does the file 'hdfs' exist in the '/home/hadoop/hadoop/bin/' directory? – maxteneff


Yes, it is at /opt/hadoop/bin/hdfs – Sushil

Answer


It looks like you are missing the Hadoop home environment variable. Set it to the directory where Hadoop is actually installed:

export HADOOP_HOME=/opt/hadoop 

Then check whether this works:

hadoop version 

Your problem appears to be caused by the missing environment variable.
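
As a minimal sketch (assuming Hadoop is installed under /opt/hadoop, as the log paths above suggest), you could make the variable persistent by appending it to ~/.bashrc and then restart the daemons:

# Persist HADOOP_HOME for future shells (path is an assumption based on the logs) 
echo 'export HADOOP_HOME=/opt/hadoop' >> ~/.bashrc 
echo 'export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin' >> ~/.bashrc 
source ~/.bashrc 

# Verify the installation is found, then restart HDFS 
hadoop version 
stop-dfs.sh 
start-dfs.sh 

If the start scripts still look for hdfs under /home/hadoop/hadoop, it is also worth checking that etc/hadoop/hadoop-env.sh does not hard-code an old installation path.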