I'm new to Accumulo and am trying to install v1.7 on a Cloudera VM. Accumulo init - [start.Main] ERROR: Problem initializing the class loader
I have Java 1.7 and HDP 2.2, and Zookeeper is currently running. I've mostly been following INSTALL.md without issue and have everything configured, but Accumulo throws the following error when I try to initialize it:
./bin/accumulo init
2016-02-23 09:24:07,999 [start.Main] ERROR: Problem initializing the class loader
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.accumulo.start.Main.getClassLoader(Main.java:68)
    at org.apache.accumulo.start.Main.main(Main.java:52)
Caused by: java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
    at org.apache.commons.vfs2.impl.DefaultFileSystemManager.<init>(DefaultFileSystemManager.java:120)
    at org.apache.accumulo.start.classloader.vfs.AccumuloVFSClassLoader.generateVfs(AccumuloVFSClassLoader.java:246)
    at org.apache.accumulo.start.classloader.vfs.AccumuloVFSClassLoader.getClassLoader(AccumuloVFSClassLoader.java:204)
    ... 6 more
Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at org.apache.accumulo.start.classloader.AccumuloClassLoader$2.loadClass(AccumuloClassLoader.java:281)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 9 more
Exception in thread "Thread-0" java.lang.NoClassDefFoundError: org/apache/commons/io/FileUtils
    at org.apache.accumulo.start.classloader.vfs.AccumuloVFSClassLoader.close(AccumuloVFSClassLoader.java:406)
    at org.apache.accumulo.start.classloader.vfs.AccumuloVFSClassLoader$AccumuloVFSClassLoaderShutdownThread.run(AccumuloVFSClassLoader.java:74)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.io.FileUtils
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at org.apache.accumulo.start.classloader.AccumuloClassLoader$2.loadClass(AccumuloClassLoader.java:281)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 3 more
I've read other posts where this was put down to a bad setting in accumulo-env.sh, but looking at mine below I can't see what I'm missing (a quick path check is sketched just after the snippet):
if [[ -z $HADOOP_HOME ]] ; then
test -z "$HADOOP_PREFIX" && export HADOOP_PREFIX=/usr/lib/hadoop
else
HADOOP_PREFIX="$HADOOP_HOME"
unset HADOOP_HOME
fi
# hadoop-2.0:
test -z "$HADOOP_CONF_DIR" && export HADOOP_CONF_DIR="$HADOOP_PREFIX/etc/hadoop"
test -z "$ACCUMULO_HOME" && export ACCUMULO_HOME="/etc/accumulo/accumulo-1.7.0"
test -z "$JAVA_HOME" && export JAVA_HOME="/usr/java/jdk1.7.0_67-cloudera"
test -z "$ZOOKEEPER_HOME" && export ZOOKEEPER_HOME=/usr/lib/zookeeper
test -z "$ACCUMULO_LOG_DIR" && export ACCUMULO_LOG_DIR=$ACCUMULO_HOME/logs
if [[ -f ${ACCUMULO_CONF_DIR}/accumulo.policy ]]
then
POLICY="-Djava.security.manager -Djava.security.policy=${ACCUMULO_CONF_DIR}/accumulo.policy"
fi
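In case it's a plain path problem rather than a config problem, a quick sanity check I can run against the values above (just a sketch; these are the directories my accumulo-env.sh resolves to, and they may differ on other layouts) is:

# check that each directory accumulo-env.sh points at actually exists on this VM
for d in /usr/lib/hadoop /usr/lib/hadoop/etc/hadoop /etc/accumulo/accumulo-1.7.0 \
         /usr/java/jdk1.7.0_67-cloudera /usr/lib/zookeeper; do
    [ -d "$d" ] && echo "OK      $d" || echo "MISSING $d"
done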
And I already have the following in my general.classpaths:
<property>
<name>general.classpaths</name>
<value>
<!-- Accumulo requirements -->
$ACCUMULO_HOME/lib/accumulo-server.jar,
$ACCUMULO_HOME/lib/accumulo-core.jar,
$ACCUMULO_HOME/lib/accumulo-start.jar,
$ACCUMULO_HOME/lib/accumulo-fate.jar,
$ACCUMULO_HOME/lib/accumulo-proxy.jar,
$ACCUMULO_HOME/lib/[^.].*.jar,
<!-- ZooKeeper requirements -->
$ZOOKEEPER_HOME/zookeeper[^.].*.jar,
<!-- Common Hadoop requirements -->
$HADOOP_CONF_DIR,
<!-- Hadoop 2 requirements --><!--
$HADOOP_PREFIX/share/hadoop/common/[^.].*.jar,
$HADOOP_PREFIX/share/hadoop/common/lib/(?!slf4j)[^.].*.jar,
$HADOOP_PREFIX/share/hadoop/hdfs/[^.].*.jar,
$HADOOP_PREFIX/share/hadoop/mapreduce/[^.].*.jar,
$HADOOP_PREFIX/share/hadoop/yarn/[^.].*.jar,
$HADOOP_PREFIX/share/hadoop/yarn/lib/jersey.*.jar,
--><!-- End Hadoop 2 requirements -->
<!-- HDP 2.0 requirements --><!--
/usr/lib/hadoop/[^.].*.jar,
/usr/lib/hadoop/lib/[^.].*.jar,
/usr/lib/hadoop-hdfs/[^.].*.jar,
/usr/lib/hadoop-mapreduce/[^.].*.jar,
/usr/lib/hadoop-yarn/[^.].*.jar,
/usr/lib/hadoop-yarn/lib/jersey.*.jar,
--><!-- End HDP 2.0 requirements -->
<!-- HDP 2.2 requirements -->
/usr/hdp/current/hadoop-client/[^.].*.jar,
/usr/hdp/current/hadoop-client/lib/(?!slf4j)[^.].*.jar,
/usr/hdp/current/hadoop-hdfs-client/[^.].*.jar,
/usr/hdp/current/hadoop-mapreduce-client/[^.].*.jar,
/usr/hdp/current/hadoop-yarn-client/[^.].*.jar,
/usr/hdp/current/hadoop-yarn-client/lib/jersey.*.jar,
/usr/hdp/current/hive-client/lib/hive-accumulo-handler.jar
/usr/lib/hadoop/lib/commons-io-2.4.jar
<!-- End HDP 2.2 requirements -->
</value>
<description>Classpaths that accumulo checks for updates and class files.</description>
</property>
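To double-check that those HDP 2.2 entries actually match the jars the stack trace complains about (commons-logging and commons-io), a rough comparison I can do on the box (the search roots are guesses based on the uncommented entries above) is:

# list where the missing commons jars actually live, then compare those
# directories against the uncommented regex entries in general.classpaths
find -L /usr/hdp/current/hadoop-client /usr/lib/hadoop \
    -name 'commons-logging*.jar' -o -name 'commons-io*.jar' 2>/dev/null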
Any help would be greatly appreciated. Interestingly, I get the same result when I try to run ./bin/accumulo classpath.
Thanks elserj, I've edited my general.classpaths to include this (see above), but I'm still hitting the same problem. As per my original post (last line), I can't check the output of 'accumulo classpath' because I get the same error. – jhole89
Oops, sorry. I missed the last point about not being able to run the classpath cmd :). You could try running find -L /usr/lib/hadoop -name 'commons-io-2.4.jar' and compare the result against the regexes in general.classpaths. – elserj
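For reference, a minimal version of that comparison (the jar name and search root come from the comment above; the accumulo-site.xml location is assumed from the ACCUMULO_HOME set earlier) might look like:

# locate the jar, then see whether its parent directory is covered by any
# uncommented general.classpaths entry in accumulo-site.xml
find -L /usr/lib/hadoop -name 'commons-io-2.4.jar'
grep -nE 'commons-io|/usr/lib/hadoop' /etc/accumulo/accumulo-1.7.0/conf/accumulo-site.xml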