I am trying to run the command hadoop dfs -ls, and I keep getting two errors one after the other: a connection exception, and then a com.google.protobuf.InvalidProtocolBufferException:
Call From localhost/127.0.0.1 to yass-SATELLITE-C855-2CF:8021 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Once that one is resolved, I get this other error:
ls: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message tag had invalid wire type.; Host Details : local host is: "localhost/127.0.0.1"; destination host is: "yass-SATELLITE-C855-2CF":9000;
and I keep going back and forth between these two errors.
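In case it is useful, this is the kind of sanity check I can run to see whether the NameNode is actually up and listening on port 9000 (just a sketch, assuming a standard pseudo-distributed install where jps and ss are available):

jps                      # should list a NameNode process among the Java daemons
ss -tln | grep 9000      # should show a socket listening on port 9000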
My core-site.xml:
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://yass-SATELLITE-C855-2CF:9000</value>
    </property>
</configuration>
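What puzzles me is that the first error mentions port 8021 while fs.defaultFS points to port 9000, so maybe the client is reading a different configuration than the one above. A way to check which value the client actually resolves (assuming HADOOP_CONF_DIR points at the directory I edited) would be:

hdfs getconf -confKey fs.defaultFS    # I would expect hdfs://yass-SATELLITE-C855-2CF:9000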
My hdfs-site.xml:
<property>
    <name>dfs.replication</name>
    <value>1</value>
</property>
<property>
    <name>dfs.safemode.threshold.pct</name>
    <value>0</value>
</property>
<property>
    <name>dfs.namenode.name.dir</name>
    <value>/hadoop/data/namenode</value>
</property>
<property>
    <name>dfs.datanode.data.dir</name>
    <value>/hadoop/data/datanode</value>
</property>
<property>
    <name>dfs.name.dir</name>
    <value>/home/yass/Téléchargements/hadoop/hdfs/name</value>
</property>
<property>
    <name>dfs.datanode.use.datanode.hostname</name>
    <value>false</value>
</property>
<property>
    <name>dfs.namenode.datanode.registration.ip-hostname-check</name>
    <value>false</value>
</property>
/etc/hosts:
127.0.0.1 localhost
127.0.0.1 yass-SATELLITE-C855-2CF
# The following lines are desirable for IPv6 capable hosts
::1 ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
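For what it's worth, this is how I can confirm that the hostname actually resolves to the loopback entry above (assuming getent is available):

getent hosts yass-SATELLITE-C855-2CF    # should print 127.0.0.1
hostname                                # should print yass-SATELLITE-C855-2CF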
I get the first exception, and once that one goes away I get the second; I am stuck looping between these two exceptions all the time. Any suggestions?