I have set up a Hadoop multi-node cluster with a master and a slave node and have also configured SSH between them, so I can now connect from the master to the slave without a password.
However, when I try to run start-dfs.sh on the master node, it cannot connect to the slave node and stops at the line shown below.
Log:
[email protected]:~$ start-all.sh
starting namenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-HNname-namenode-master.out
[email protected]'s password: master: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-HNname-datanode-master.out
When I press Enter:
slave: Connection closed by 192.168.0.2
master: starting secondarynamenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-HNname-secondarynamenode-master.out
jobtracker running as process 10396. Stop it first.
[email protected]'s password: master: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-HNname-tasktracker-master.out
slave: Permission denied, please try again.
[email protected]'s password:
When I enter the slave's password, the connection gets closed.
I have tried the following on both the master & slave nodes, but without any result:
- formatted the NameNode
- overridden the default HADOOP_LOG_DIR as described in this post
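A quick way to narrow this down is to make the same connection the startup script makes, by hand, as the user that launches the daemons; hduser and 192.168.0.3 below are taken from the log above, so substitute your own values:

hduser@master:~$ ssh [email protected]      # the connection start-dfs.sh attempts for the slave
hduser@master:~$ ssh -v [email protected]   # verbose mode shows which authentication methods are tried

If this interactive test still prompts for a password, the startup scripts will too: start-dfs.sh simply loops over the hosts listed in conf/slaves and runs ssh for each of them.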
Yes, that should fix the issue, because it is clear that passwordless SSH is not set up correctly –
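A minimal sketch of setting that up, assuming the user and address from the log above and the default OpenSSH key location:

hduser@master:~$ ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa    # key pair with an empty passphrase for the Hadoop user
hduser@master:~$ ssh-copy-id [email protected]               # installs the public key on the slave; asks for the password one last time
hduser@master:~$ ssh [email protected]                       # should now log in without a password prompt

If ssh-copy-id is not available, appending the master's ~/.ssh/id_rsa.pub to ~/.ssh/authorized_keys on the slave does the same thing; also make sure ~/.ssh is mode 700 and authorized_keys mode 600 on the slave, because sshd falls back to password authentication when those permissions are too open.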