
Spark SQL on Google Compute Engine

We deployed a Spark (1.2.0) cluster using bdutil 1.1. However, when we launch our Spark script, we run into the following problem:

py4j.protocol.Py4JJavaError: An error occurred while calling o70.registerTempTable. 
: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient 
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346) 
at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:235) 
at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:231) 
at scala.Option.orElse(Option.scala:257) 
at org.apache.spark.sql.hive.HiveContext.x$3$lzycompute(HiveContext.scala:231) 
at org.apache.spark.sql.hive.HiveContext.x$3(HiveContext.scala:229) 
at org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:229) 
at org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:229) 
at org.apache.spark.sql.hive.HiveMetastoreCatalog.<init>(HiveMetastoreCatalog.scala:54) 
at org.apache.spark.sql.hive.HiveContext$$anon$1.<init>(HiveContext.scala:253) 
at org.apache.spark.sql.hive.HiveContext.catalog$lzycompute(HiveContext.scala:253) 
at org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:253) 
at org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:72) 
at org.apache.spark.sql.SQLContext.registerRDDAsTable(SQLContext.scala:279) 
at org.apache.spark.sql.SchemaRDDLike$class.registerTempTable(SchemaRDDLike.scala:86) 
at org.apache.spark.sql.api.java.JavaSchemaRDD.registerTempTable(JavaSchemaRDD.scala:42) 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
at java.lang.reflect.Method.invoke(Method.java:606) 
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231) 
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379) 
at py4j.Gateway.invoke(Gateway.java:259) 
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133) 
at py4j.commands.CallCommand.execute(CallCommand.java:79) 
at py4j.GatewayConnection.run(GatewayConnection.java:207) 
at java.lang.Thread.run(Thread.java:745) 
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient 
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412) 
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62) 
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72) 
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453) 
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465) 
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340) 
... 26 more 
Caused by: java.lang.reflect.InvocationTargetException 
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410) 
... 31 more 
Caused by: javax.jdo.JDOFatalUserException: Class org.datanucleus.api.jdo.JDOPersistenceManagerFactory was not found. 

The script works fine on my laptop. I have datanucleus-api-jdo-3.2.6.jar in /home/hadoop/spark-install/lib.

Any idea what might be wrong?


It looks like 'org.datanucleus.api.jdo.JDOPersistenceManagerFactory' is not on the classpath. – 2015-03-13 13:38:32


I tried adding "--jars /home/hadoop/spark-install/lib/datanucleus-api-jdo-3.2.6.jar" at the end of my spark-submit command, but it doesn't work. – poiuytrez 2015-03-13 13:44:31

Answer


I needed to add:

SPARK_CLASSPATH=/home/hadoop/spark-install/lib/datanucleus-api-jdo-3.2.6.jar:/home/hadoop/spark-install/lib/datanucleus-core-3.2.10.jar:/home/hadoop/spark-install/lib/datanucleus-rdbms-3.2.9.jar 

in front of my spark-submit command:

SPARK_CLASSPATH=/home/hadoop/spark-install/lib/datanucleus-api-jdo-3.2.6.jar:/home/hadoop/spark-install/lib/datanucleus-core-3.2.10.jar:/home/hadoop/spark-install/lib/datanucleus-rdbms-3.2.9.jar ../hadoop/spark-install/bin/spark-submit main.py --master spark://spark-m:7077 
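Rather than hard-coding each DataNucleus jar version, the classpath string can be built from whatever datanucleus-*.jar files are present in the lib directory. This is a sketch, not part of the original answer; it assumes the bdutil default install location /home/hadoop/spark-install and that the glob matches at least one jar:

```shell
#!/bin/sh
# Collect all DataNucleus jars from the Spark lib directory into a
# colon-separated classpath string (assumed path; adjust for your install).
LIB=/home/hadoop/spark-install/lib
DN_JARS=$(echo "$LIB"/datanucleus-*.jar | tr ' ' ':')

# Same effect as the hard-coded SPARK_CLASSPATH above, but survives
# jar version bumps.
SPARK_CLASSPATH="$DN_JARS" /home/hadoop/spark-install/bin/spark-submit \
    main.py --master spark://spark-m:7077
```

Note that glob expansion sorts matches lexically, so the resulting order is deterministic; if the glob matches nothing, the literal pattern is passed through, so check that the jars exist first.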