Although I can connect to HBase successfully from the spark-shell, Spark 1.4.0 fails to connect to HBase 1.1.0.1 with the error:
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.util.Addressing.getIpAddress()Ljava/net/InetAddress;
Can anyone tell where the problem is?
Detailed error:
15/07/01 18:57:57 ERROR yarn.ApplicationMaster: User class threw exception: java.io.IOException: java.lang.reflect.InvocationTargetException
java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)
at com.koudai.resys.tmp.HbaseLearning$.main(HbaseLearning.scala:22)
at com.koudai.resys.tmp.HbaseLearning.main(HbaseLearning.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:483)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
... 9 more
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.util.Addressing.getIpAddress()Ljava/net/InetAddress;
at org.apache.hadoop.hbase.client.ClientIdGenerator.getIpAddressBytes(ClientIdGenerator.java:83)
at org.apache.hadoop.hbase.client.ClientIdGenerator.generateClientId(ClientIdGenerator.java:43)
at org.apache.hadoop.hbase.client.PerClientRandomNonceGenerator.<init>(PerClientRandomNonceGenerator.java:37)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:682)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:630)
... 14 more
The SBT configuration:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.4.0" % "provided"
libraryDependencies += "org.apache.hbase" % "hbase" % "1.1.0.1"
libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.1.0.1"
libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.1.0.1"
libraryDependencies += "org.apache.hbase" % "hbase-server" % "1.1.0.1"
libraryDependencies += "org.apache.hbase" % "hbase-hadoop2-compat" % "1.1.0.1"
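As a build-time guard against mixed HBase versions, sbt can pin every transitive HBase artifact to one version. A minimal sketch of a build.sbt fragment (assuming sbt 0.13+; the artifact list here is illustrative):

```scala
// build.sbt fragment (assumption: sbt 0.13+). Force any transitive HBase
// dependency to the intended version so the build cannot mix 0.94 and 1.1.x.
dependencyOverrides ++= Set(
  "org.apache.hbase" % "hbase-client" % "1.1.0.1",
  "org.apache.hbase" % "hbase-common" % "1.1.0.1"
)
```

Note this only protects the jars sbt assembles; it cannot fix a stale jar sitting on the cluster's runtime classpath.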
The code being run:
val sc = new SparkConf().setAppName("[email protected]")
val conf = HBaseConfiguration.create()
conf.set("hbase.zookeeper.property.clientPort", "2181")
conf.set("hbase.zookeeper.quorum", "idc02-rs-sfa-10")
// the error is raised from here
val conn = ConnectionFactory.createConnection(conf)
Listing the methods of org.apache.hadoop.hbase.util.Addressing via reflection shows it is the HBase 0.94 version. Where could it be coming from?
parsePort
createHostAndPortStr
createInetSocketAddressFromHostAndPortStr
getIpAddress
getIp4Address
getIp6Address
parseHostname
isLocalAddress
wait
wait
wait
equals
toString
hashCode
getClass
notify
notifyAll
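The method listing above can be reproduced with plain Java reflection. A minimal sketch, demonstrated on java.lang.String since the HBase jars are not assumed on this snippet's classpath; on the cluster, substitute classOf[org.apache.hadoop.hbase.util.Addressing]:

```scala
// Print every public method a class exposes at runtime (including those
// inherited from java.lang.Object, which is why wait/notify show up above).
object MethodLister {
  def methodNames(cls: Class[_]): Seq[String] =
    cls.getMethods.map(_.getName).toSeq.distinct.sorted

  def main(args: Array[String]): Unit =
    // Substitute classOf[org.apache.hadoop.hbase.util.Addressing] on the cluster.
    methodNames(classOf[String]).foreach(println)
}
```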
You only need hbase-client in your application. Why are you including hbase-server and hbase-common in your app? Are you using CDH or HDP? Which version of the distribution? Also, what is the application's classpath at runtime? Does it reference any stale HBase libraries from an old installation? –
Thanks. 1. hbase-common is also needed to pass the tests; 2. I am using the plain Apache release, 2.5.2; 3. the classpath already contains the hadoop libs; 4. you are right: by digging through the classpath I found that someone had accidentally put an old HBase 0.94 library into the Hadoop lib directory, which polluted the classpath – Vanjor
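A quick way to confirm which jar a suspect class was actually loaded from (the check that exposed the stale 0.94 jar here) is to ask the class for its code source. A minimal sketch; on a real cluster the intended argument would be classOf[org.apache.hadoop.hbase.util.Addressing]:

```scala
// Report where a class was loaded from. Classes loaded by the bootstrap
// class loader (e.g. java.lang.String) have no code source.
object JarLocator {
  def jarOf(cls: Class[_]): String =
    Option(cls.getProtectionDomain.getCodeSource)
      .map(_.getLocation.toString)
      .getOrElse("<bootstrap classpath>")

  def main(args: Array[String]): Unit =
    // On the cluster: println(jarOf(classOf[org.apache.hadoop.hbase.util.Addressing]))
    println(jarOf(getClass)) // prints this app's own jar or classes directory
}
```

Running this inside the Spark application (rather than locally) is what matters, since the driver/executor classpath is where the pollution occurred.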