rhive.connect() issue with MapR distribution

After loading and initializing RHive (with rhive.init()), rhive.connect() fails with the following error:

java.lang.UnsatisfiedLinkError: no MapRClient in java.library.path 
     at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1734) 
     at java.lang.Runtime.loadLibrary0(Runtime.java:823) 
     at java.lang.System.loadLibrary(System.java:1028) 
     at com.mapr.fs.MapRFileSystem.<clinit>(MapRFileSystem.java:1298) 
     at java.lang.Class.forName0(Native Method) 
     at java.lang.Class.forName(Class.java:247) 
     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1028) 
     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1079) 
     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1413) 
     at org.apache.hadoop.fs.FileSystem.access$100(FileSystem.java:69) 
     at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1453) 
     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1435) 
     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:232) 
     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:115) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) 
     at java.lang.reflect.Method.invoke(Method.java:597) 
     at RJavaTools.invokeMethod(RJavaTools.java:386) 
Unable to load libMapRClient.so native library 
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl, : 
    java.lang.UnsatisfiedLinkError: no MapRClient in java.library.path 
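
(A quick way to see what the JVM is actually using: once rJava is loaded, the java.library.path property can be printed from R. This is only a diagnostic sketch and assumes rJava starts the JVM with the same settings RHive sees.)

library(rJava)
.jinit()                     # starts the JVM if it is not already running
# Print the JVM's java.library.path; the MapR native directory should show up here
.jcall("java/lang/System", "S", "getProperty", "java.library.path")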

I get the same error when I run rhive.connect("10.2.138.168") with the Hive server's IP address. My environment variables are set to:

MAHOUT_HOME=/opt/mapr/mahout/mahout-0.7 
JAVA_HOME=/usr/java/default 
HADOOP_HOME=/opt/mapr/hadoop/hadoop-0.20.2 
HADOOP_CONF_DIR=/opt/mapr/hadoop/hadoop-0.20.2/conf 
HIVE_HOME=/opt/mapr/hive/hive-0.9.0 
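
(For completeness, the same variables can also be set from inside R before RHive is loaded, in case the shell profile is not inherited by the R session. A minimal sketch, assuming RHive reads them through Sys.getenv(); the paths simply mirror the ones above.)

Sys.setenv(HADOOP_HOME     = "/opt/mapr/hadoop/hadoop-0.20.2",
           HADOOP_CONF_DIR = "/opt/mapr/hadoop/hadoop-0.20.2/conf",
           HIVE_HOME       = "/opt/mapr/hive/hive-0.9.0")
library(RHive)
rhive.init()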

If I run rhive.env(), I get the following warnings/errors:

Hive Home Directory : /opt/mapr/hive/hive-0.9.0 
Hadoop Home Directory : /opt/mapr/hadoop/hadoop-0.20.2 
Hadoop Conf Directory : /opt/mapr/hadoop/hadoop-0.20.2/conf 
Default RServe List 
################################# IMPORTANT ############################################# 
############################### /\/\/\/\/\/\/\ ########################################## 
# Use of slaves and masters file to start/stop jobtracker/tasktracker is not supported. # 
# Please use maprcli: 
# /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -jobtracker start 
# /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -tasktracker start 
# /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -jobtracker stop 
# /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -tasktracker stop 
######################################################################################### 
warning: cant't connect to a Rserver at ################################# IMPORTANT #############################################:6311 
warning: cant't connect to a Rserver at ############################### /\/\/\/\/\/\/\ ##########################################:6311 
warning: cant't connect to a Rserver at # Use of slaves and masters file to start/stop jobtracker/tasktracker is not supported.:6311 
warning: cant't connect to a Rserver at # Please use maprcli::6311 
warning: cant't connect to a Rserver at # /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -jobtracker start:6311 
warning: cant't connect to a Rserver at # /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -tasktracker start:6311 
warning: cant't connect to a Rserver at # /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -jobtracker stop:6311 
warning: cant't connect to a Rserver at # /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -tasktracker stop:6311 
warning: cant't connect to a Rserver at #########################################################################################:6311 
Disconnected HiveServer and HDFS 
Warning messages: 
1: In socketConnection(host, port, open = "a+b", blocking = TRUE) : 
    ################################# IMPORTANT #############################################:6311 cannot be opened 
2: In socketConnection(host, port, open = "a+b", blocking = TRUE) : 
    ############################### /\/\/\/\/\/\/\ ##########################################:6311 cannot be opened 
3: In socketConnection(host, port, open = "a+b", blocking = TRUE) : 
    # Use of slaves and masters file to start/stop jobtracker/tasktracker is not supported.:6311 cannot be opened 
4: In socketConnection(host, port, open = "a+b", blocking = TRUE) : 
    # Please use maprcli::6311 cannot be opened 
5: In socketConnection(host, port, open = "a+b", blocking = TRUE) : 
    # /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -jobtracker start:6311 cannot be opened 
6: In socketConnection(host, port, open = "a+b", blocking = TRUE) : 
    # /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -tasktracker start:6311 cannot be opened 
7: In socketConnection(host, port, open = "a+b", blocking = TRUE) : 
    # /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -jobtracker stop:6311 cannot be opened 
8: In socketConnection(host, port, open = "a+b", blocking = TRUE) : 
    # /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -tasktracker stop:6311 cannot be opened 
9: In socketConnection(host, port, open = "a+b", blocking = TRUE) : 
    #########################################################################################:6311 cannot be opened 
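
(My guess, not confirmed anywhere in the RHive docs, is that the "Default RServe List" is built from the Hadoop slaves file, which on MapR contains only the warning banner shown above, so every banner line ends up being treated as an Rserve host. A small sketch to inspect what is being picked up; the file location is an assumption based on HADOOP_CONF_DIR.)

slaves_file <- file.path(Sys.getenv("HADOOP_CONF_DIR"), "slaves")   # assumed source of the host list
if (file.exists(slaves_file)) cat(readLines(slaves_file), sep = "\n")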

Edit:

I then set:

export LD_LIBRARY_PATH=/opt/mapr/hadoop/hadoop-0.20.2/lib/native/Linux-amd64-64/ 

Now rhive.connect() returns the following error:

2012-12-27 17:09:23,1578 ERROR Client fs/client/fileclient/cc/client.cc:676 Thread: 139849490286464 Unlink failed for file rhive_udf.jar, error Permission denied(13) 
2012-12-27 17:09:23,1578 ERROR JniCommon fs/client/fileclient/cc/jni_common.cc:1219 Thread: 139849490286464 remove: File /rhive/lib/rhive_udf.jar, rpc error, Permission denied(13) 
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl, : 
    java.io.IOException: Target hdfs:/rhive/lib/rhive_udf.jar already exists 
SLF4J: Class path contains multiple SLF4J bindings. 
SLF4J: Found binding in [jar:file:/opt/mapr/hive/hive-0.9.0/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class] 
SLF4J: Found binding in [jar:file:/opt/mapr/hadoop/hadoop-0.20.2/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class] 
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl, : 
    org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused 
NULL 
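
(It looks like RHive cannot replace /rhive/lib/rhive_udf.jar on the distributed filesystem because the current user lacks write permission there. A hedged sketch for checking the existing jar and, as a sufficiently privileged user, removing the stale copy; the paths come from the error messages above and the commands assume the stock hadoop fs shell.)

system("hadoop fs -ls /rhive/lib")                  # inspect the existing jar and its owner
# system("hadoop fs -rm /rhive/lib/rhive_udf.jar")  # run only as a user allowed to delete it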

Any idea where I'm going wrong? Thanks!

Answer


Java cannot find the native library. You have to set the java.library.path property to point to the directory containing the library to be loaded when you start your application. Try something like this:

java -Djava.library.path=/opt/mapr/hadoop/hadoop-0.20.2/lib/native/ 
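
Since RHive starts the JVM through rJava rather than with a standalone java command, the property has to be in place before the JVM is created. One way to do that from R is a sketch along these lines (it assumes rJava has not yet started the JVM in the session; otherwise the LD_LIBRARY_PATH export from the question is the way to go):

options(java.parameters =
        "-Djava.library.path=/opt/mapr/hadoop/hadoop-0.20.2/lib/native/Linux-amd64-64")
library(RHive)   # loads rJava; the option above is passed when the JVM is started
rhive.init()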

Thanks, that worked; I mean, I'm now getting a different error. –


What do you have in your native folder? You should have another subfolder in there depending on your OS and architecture. – agstudy
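
For example, something along these lines (directory taken from the paths in the question) lists what is actually there:

list.files("/opt/mapr/hadoop/hadoop-0.20.2/lib/native", recursive = TRUE)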