0

I have seen this question mentioned several times and have followed the steps posted here, but I am still hitting java.lang.NoSuchFieldError: IBM_JAVA with Hadoop 2.4.1 and Flume 1.5.0.1.

My flume-env.sh is configured as below:

FLUME_CLASSPATH="/var/lib/apache-flume-ng:lib/hadoop-core-1.2.0.jar:lib/hadoop-auth-2.4.1.jar:lib/hadoop-yarn-api-2.4.1.jar:lib/hadoop-mapreduce-client-jobclient-2.4.1.jar:lib/hadoop-mapreduce-client-core-2.4.1.jar:lib/hadoop-common-2.4.1.jar:lib/hadoop-annotations-2.4.1.jar" 

In addition to these jars, I have also added commons-configuration-1.6.jar, which is available in Flume's lib directory. I am new to Flume and Hadoop. Please help me.

The full stack trace is below:

ERROR [conf-file-poller-0] (org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run:149) - Unhandled error 
java.lang.NoSuchFieldError: IBM_JAVA 
    at org.apache.hadoop.security.UserGroupInformation.getOSLoginModuleName(UserGroupInformation.java:337) 
    at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:382) 
    at org.apache.flume.sink.hdfs.HDFSEventSink.authenticate(HDFSEventSink.java:553) 
    at org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:272) 
    at org.apache.flume.conf.Configurables.configure(Configurables.java:41) 
    at org.apache.flume.node.AbstractConfigurationProvider.loadSinks(AbstractConfigurationProvider.java:418) 
    at org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:103) 
    at org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:140) 
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) 
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304) 
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178) 
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
    at java.lang.Thread.run(Thread.java:745) 

Answers

-1

I finally found the answer to this.

  1. Copy the PlatformName class from hadoop-auth and compile it locally with your customization.

    // Custom copy of org.apache.hadoop.util.PlatformName that defines the
    // IBM_JAVA field referenced by UserGroupInformation.
    package org.apache.hadoop.util;

    public class PlatformName {

        // e.g. "Linux-amd64-64"
        private static final String platformName =
                System.getProperty("os.name") + "-"
                + System.getProperty("os.arch") + "-"
                + System.getProperty("sun.arch.data.model");

        public static final String JAVA_VENDOR_NAME = System.getProperty("java.vendor");

        // True when running on an IBM JVM.
        public static final boolean IBM_JAVA = JAVA_VENDOR_NAME.contains("IBM");

        public static String getPlatformName() {
            return platformName;
        }

        public static void main(String[] args) {
            System.out.println(platformName);
        }
    }
    
  2. Copy the compiled class file back into hadoop-core (a minimal sketch of the shell commands is shown below). You should then be up and running.
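
A minimal sketch of how the two steps might be carried out from the shell, assuming the customized class is saved as PlatformName.java and Flume is installed under /var/lib/apache-flume-ng as in the question (adjust paths and jar versions to your setup):

    # Compile the customized PlatformName; it only uses java.lang.System,
    # so no extra classpath is required.
    mkdir -p build
    javac -d build PlatformName.java

    # 'jar uf' updates (or adds) the compiled class inside the existing
    # hadoop-core jar that Flume loads.
    cd build
    jar uf /var/lib/apache-flume-ng/lib/hadoop-core-1.2.0.jar \
        org/apache/hadoop/util/PlatformName.class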

Thanks all.

+0

The solution is to add all the jars found in the $HADOOP_HOME/share libraries; with this you don't have to add any custom code – Raghuveer 2014-10-13 07:36:40
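
A sketch of what that could look like in flume-env.sh, assuming a standard Hadoop 2.x tarball layout with the client jars under $HADOOP_HOME/share/hadoop (the exact path and the find pattern are assumptions, not taken from the comment):

    # Collect every jar under $HADOOP_HOME/share/hadoop into a single
    # colon-separated classpath string for Flume.
    FLUME_CLASSPATH="$(find "$HADOOP_HOME/share/hadoop" -name '*.jar' | tr '\n' ':')"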

3

The problem is caused by missing or conflicting dependencies:

  1. Add hadoop-auth to the classpath.
  2. If the problem still appears, remove hadoop-core from your classpath; it conflicts with hadoop-auth.

This will fix the problem.
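
A sketch of what the FLUME_CLASSPATH from the question might look like after applying both points, i.e. keeping hadoop-auth-2.4.1.jar and dropping hadoop-core-1.2.0.jar (jar names taken from the question; verify against your own lib directory):

    # flume-env.sh: Hadoop 2.4.1 client jars only. hadoop-core-1.2.0.jar is
    # removed because it bundles an older org.apache.hadoop.util.PlatformName
    # that lacks the IBM_JAVA field, which triggers the NoSuchFieldError.
    FLUME_CLASSPATH="/var/lib/apache-flume-ng:lib/hadoop-auth-2.4.1.jar:lib/hadoop-yarn-api-2.4.1.jar:lib/hadoop-mapreduce-client-jobclient-2.4.1.jar:lib/hadoop-mapreduce-client-core-2.4.1.jar:lib/hadoop-common-2.4.1.jar:lib/hadoop-annotations-2.4.1.jar"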

+0

Yes, this solved my problem. I had to add hadoop-auth and remove hadoop-core. – edge 2016-12-25 14:50:45
