I understand that questions about "Could not find or load main class" may already have been answered, but my problem is still there: Could not find or load main class org.apache.hadoop.fs.FsShell.
I have a Hadoop VM built in VMware running CentOS 7. I can start the NameNode and DataNode, but when I try to list HDFS files with the following command:
hdfs dfs -ls
it throws the following error:
Could not find or load main class org.apache.hadoop.fs.FsShell
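As far as I understand, the FsShell class ships in the hadoop-common jar, so one sanity check (the jar path here is assumed from my install layout) is:
# confirm the jar that should contain FsShell is present
ls /opt/hadoop/hadoop-2.7.2/share/hadoop/common/hadoop-common-*.jar
# confirm the class is actually inside it (needs the JDK's jar tool)
jar tf /opt/hadoop/hadoop-2.7.2/share/hadoop/common/hadoop-common-2.7.2.jar | grep FsShell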
My Google searches suggest this might be related to the Hadoop variables set in bash. Here is my setup:
# .bashrc
# Source global definitions
if [ -f /etc/bashrc ]; then
. /etc/bashrc
fi
export HADOOP_HOME=/opt/hadoop/hadoop-2.7.2
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_PREFIX=$HADOOP_HOME
export HIVE_HOME=/opt/hadoop/hive
export PATH=$HIVE_HOME/bin:$PATH
export ANT_HOME=/usr/local/apache-ant-1.9.7
export PATH=${PATH}:${JAVA_HOME}/bin
export PIG_HOME=/opt/hadoop/pig-0.15.0
export PIG_HADOOP_VERSION=0.15.0
export PIG_CLASSPATH=$HADOOP_HOME/etc/hadoop
export PATH=$PATH:$PIG_HOME/bin
export PATH=$PATH:$HADOOP_HOME/bin
export HADOOP_USER_CLASSPATH_FIRST=true
export SQOOP_HOME=/usr/lib/sqoop
export PATH=$PATH:$SQOOP_HOME/bin
export HADOOP_CLASSPATH=$HADOOP_HOME/share/hadoop/common/
export PATH=$PATH:$HADOOP_CLASSPATH
# Uncomment the following line if you don't like systemctl's auto-paging feature
# export SYSTEMD_PAGER=
# User specific aliases and functions
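For reference, a quick way to see which classpath the launcher scripts actually resolve (assuming the stock Hadoop 2.7 wrapper scripts) is:
# print the classpath as computed by the hadoop/hdfs launcher scripts
hadoop classpath
hdfs classpath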
I checked my Hadoop folder /opt/hadoop/hadoop-2.7.2/share/hadoop/common; here is the listing:
I am using the root account for this exercise. Can anyone help me find the cause of the problem and fix it? Thank you very much.
Can you run 'echo ${HADOOP_CLASSPATH}' please? – SMA
Thank you. /opt/hadoop/hadoop-2.7.2/share/hadoop/common/ – mdivk
Try running 'export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/opt/hadoop/hadoop-2.7.2/share/hadoop/common/hadoop-common-2.7.2.jar' – SMA
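A minimal sketch of applying that suggestion and re-checking, assuming the jar name matches a standard 2.7.2 install:
# append the hadoop-common jar explicitly, then re-run the failing command
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/opt/hadoop/hadoop-2.7.2/share/hadoop/common/hadoop-common-2.7.2.jar
hdfs dfs -ls /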