Hadoop: Unable to load native hadoop library

I know this is a well-known problem all over the internet, and there are plenty of sites showing how to solve it, as well as lots of Q&As about it. But none of them helped me, and I'm getting frustrated now. So I'll describe my setup in as much detail as I can; if I've left anything out, I'd appreciate your help.
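The exact message is the standard NativeCodeLoader warning ("WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable"). As a quick diagnostic (just a suggestion, not part of my original setup), Hadoop's built-in checknative tool lists each native component and whether it was actually loaded:

hadoop checknative -a   # prints hadoop/zlib/snappy/lz4/... with true/false for each; exits non-zero if anything is missing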

**OS** : Ubuntu 16.04 32 bit 
**Hadoop version** : Hadoop 3.0.0-alpha3 
**bashrc** : 

export HADOOP_HOME=/usr/local/hadoopec/hadoop 
export HADOOP_CONF_DIR=/usr/local/hadoopec/hadoop/etc/hadoop 
export HADOOP_MAPRED_HOME=/usr/local/hadoopec/hadoop 
export HADOOP_COMMON_HOME=/usr/local/hadoopec/hadoop 
export HADOOP_HDFS_HOME=/usr/local/hadoopec/hadoop 
export PATH=$PATH:/usr/local/hadoopec/hadoop/bin 

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-i386 

export LD_LIBRARY_PATH=/usr/local/hadoopec/hadoop/lib/native/:$LD_LIBRARY_PATH 
#export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native" 

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native 
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib" 
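One extra check worth sharing (a sketch, assuming the stock Apache binary tarball layout): as far as I can tell, the prebuilt native libraries that ship with the Apache binary release are 64-bit Linux builds, so a 32-bit JVM cannot load them regardless of the paths above. The architecture of the bundled library can be verified with:

file /usr/local/hadoopec/hadoop/lib/native/libhadoop.so*   # 'ELF 64-bit' here cannot be loaded by a 32-bit JVM
uname -m                                                   # i686/i386 means a 32-bit userland; x86_64 means 64-bit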

**core-site.xml** : 

<property> 
    <name>hadoop.tmp.dir</name> 
    <value>/usr/hadoopec/hadoop/tmp</value> 
    <description>Temporary Directory.</description> 
</property> 

<property> 
    <name>fs.default.name</name> 
    <value>hdfs://localhost:9000</value> 
    <description>Use HDFS as file storage engine</description> 
</property> 
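To make sure this file is actually being read (fs.default.name is the older spelling of fs.defaultFS; Hadoop resolves both to the same setting), the effective value can be queried, for example:

hdfs getconf -confKey fs.defaultFS   # should print hdfs://localhost:9000 if the core-site.xml above is picked up from HADOOP_CONF_DIR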

**hdfs-env.sh** : 

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native" 
export HADOOP_OPTS="$HADOOP_OPTS -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true" 
export LD_LIBRARY_PATH=/usr/local/hadoopec/hadoop/lib/native/:$LD_LIBRARY_PATH 
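To see the actual reason the native loader gives up (rather than just the one-line warning), any HDFS command can be run with debug logging turned on; HADOOP_ROOT_LOGGER is the standard switch for this:

export HADOOP_ROOT_LOGGER=DEBUG,console
hdfs dfs -ls /    # the DEBUG lines from util.NativeCodeLoader typically show the java.library.path used and the UnsatisfiedLinkError, if any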

**hdfs-site.xml** : 

<property> 
    <name>dfs.replication</name> 
    <value>3</value> 
</property> 

<property> 
    <name>dfs.permission</name> 
    <value>false</value> 
</property> 

<property> 
    <name>dfs.namenode.name.dir</name> 
    <value>/usr/local/hadoopec/hadoop/tmp/hdfs/name</value> 
</property> 

<property> 
    <name>dfs.datanode.data.dir</name> 
    <value>/usr/local/hadoopec/hadoop/tmp/hdfs/data</value> 
</property> 
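For completeness, the sequence I use to initialize HDFS with the directories above is roughly the following (paths taken from the configs; the sbin path assumes the same install prefix as above):

mkdir -p /usr/local/hadoopec/hadoop/tmp/hdfs/name /usr/local/hadoopec/hadoop/tmp/hdfs/data
hdfs namenode -format                           # one-time format of the new name directory
/usr/local/hadoopec/hadoop/sbin/start-dfs.sh    # start NameNode/DataNode, then verify with jps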

If anyone needs more information, please feel free to ask.

**Answer**


I solved it by downloading Hadoop 2.8.0 and doing exactly the same configuration as I had with the previous Hadoop, although I don't know what exactly was causing the error/warning. localhost:50070 was giving me an "unable to connect" error. If anyone can point out the reason, I would appreciate it.
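In case it helps anyone hitting the same thing: one likely explanation for the localhost:50070 part is that Hadoop 3.x changed the default NameNode web UI port from 50070 to 9870, so on 3.0.0-alpha3 the UI would not have been at 50070 in the first place. A quick way to check which address the UI is actually bound to:

jps                                                # confirm the NameNode process is running at all
hdfs getconf -confKey dfs.namenode.http-address    # 0.0.0.0:9870 by default on 3.x, 0.0.0.0:50070 on 2.x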