Hadoop 2.7 is installed at /opt/pro/hadoop/hadoop-2.7.3 on the master. I then copied the whole installation to the slave, but into a different directory, /opt/pro/hadoop-2.7.3, and updated the environment variables (e.g., HADOOP_HOME) and hdfs-site.xml (for the namenode and datanode) on the slave machine. Should the Hadoop installation path be the same across nodes?
Now I can run hadoop version successfully on the slave. However, on the master, start-dfs.sh fails with this message:
17/02/18 10:24:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [master]
master: starting namenode, logging to /opt/pro/hadoop/hadoop-2.7.3/logs/hadoop-shijiex-namenode-shijie-ThinkPad-T410.out
master: starting datanode, logging to /opt/pro/hadoop/hadoop-2.7.3/logs/hadoop-shijiex-datanode-shijie-ThinkPad-T410.out
slave: bash: line 0: cd: /opt/pro/hadoop/hadoop-2.7.3: No such file or directory
slave: bash: /opt/pro/hadoop/hadoop-2.7.3/sbin/hadoop-daemon.sh: No such file or directory
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /opt/pro/hadoop/hadoop-2.7.3/logs/hadoop-shijiex-secondarynamenode-shijie-ThinkPad-T410.out
17/02/18 10:26:15 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Hadoop is using the master's HADOOP_HOME (/opt/pro/hadoop/hadoop-2.7.3) as the slave's HADOOP_HOME, while the HADOOP_HOME on the slave is actually /opt/pro/hadoop-2.7.3. Should HADOOP_HOME be the same across nodes at installation time?
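The log lines from the slave hint at the mechanism: in Hadoop 2.x the daemon scripts build the remote command on the master, so the directory each worker is told to `cd` into and the `hadoop-daemon.sh` path are expanded from the master's HADOOP_HOME before ssh runs. A minimal local reproduction of the slave's error (a sketch; it assumes /opt/pro/hadoop/hadoop-2.7.3 does not exist on the machine running it, just as it does not on the slave):

```shell
# Roughly what the master sends to each worker over ssh: both paths were
# expanded from the MASTER's HADOOP_HOME. Running it through bash -c mimics
# the non-interactive shell the remote side uses, and fails the same way
# ("cd: ... No such file or directory") when the path is absent.
bash -c 'cd /opt/pro/hadoop/hadoop-2.7.3 && sbin/hadoop-daemon.sh start datanode' \
  || echo "cd failed: the master's path does not exist here, as on the slave"
```

This is why the stock start-dfs.sh effectively requires the installation path (or at least a path that resolves) to be identical on every node.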
.bashrc
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export PATH=$PATH:/usr/lib/jvm/java-7-openjdk-amd64/bin
export HADOOP_HOME=/opt/pro/hadoop-2.7.3
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
# Add Hadoop bin/ directory to PATH
export PATH=$PATH:$HADOOP_HOME/bin
hadoop-env.sh
# The java implementation to use.
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
On the slave, $HADOOP_HOME/etc/hadoop contains the file masters:
[email protected]:/opt/pro/hadoop-2.7.3/etc/hadoop$ cat masters
master
I confirmed that the environment variables in .bashrc are correct, and I did not add any new variables in etc/hadoop/*.xml. Not sure whether it is related to the masters file on the slave. In any case, I forced the installations on the two servers to be identical as the current workaround. –
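An alternative to reinstalling would be to leave the slave's copy where it is and add a symlink on the slave so the master's path resolves there too. A sketch, simulated under a temp directory so it is safe to run anywhere; on a real slave the `ln -s` would target /opt directly:

```shell
#!/bin/sh
# Simulate the symlink workaround under a temp dir. On a real slave the
# equivalent would be:
#   mkdir -p /opt/pro/hadoop
#   ln -s /opt/pro/hadoop-2.7.3 /opt/pro/hadoop/hadoop-2.7.3
root=$(mktemp -d)
mkdir -p "$root/opt/pro/hadoop-2.7.3/sbin"   # the slave's actual install
mkdir -p "$root/opt/pro/hadoop"              # parent of the master's path
ln -s "$root/opt/pro/hadoop-2.7.3" "$root/opt/pro/hadoop/hadoop-2.7.3"
# The master's path now resolves on the "slave":
cd "$root/opt/pro/hadoop/hadoop-2.7.3" && echo "path resolves"
rm -rf "$root"
```

With the link in place, the `cd` that start-dfs.sh issues over ssh succeeds even though the real files live elsewhere.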