My Hive Metastore version is 2.1.0, but when I start my spark-shell it changes the schema version to 1.2.0. How do I stop Spark from setting a new Hive version?
17/06/11 12:04:03 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/root/spark-2.1.1-bin-hadoop2.7/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/root/spark/jars/datanucleus-core-3.2.10.jar."
17/06/11 12:04:07 ERROR metastore.ObjectStore: Version information found in metastore differs 2.1.0 from expected schema version 1.2.0. Schema verification is disabled hive.metastore.schema.verification so setting version.
17/06/11 12:04:09 WARN metastore.ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
This causes my Hive to stop working. I tried setting spark.sql.hive.metastore.version to 2.1.0 in spark-defaults.conf, but then my spark-shell does not work. Please help me with this.
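For reference, setting spark.sql.hive.metastore.version alone is usually not enough: Spark ships with built-in Hive 1.2.x classes, so it also needs to be pointed at matching Hive 2.1.0 JARs via spark.sql.hive.metastore.jars, and schema verification can be enabled so the metastore version is never silently overwritten. A sketch of the relevant spark-defaults.conf entries follows; the paths are placeholders for your own installation:

```
# Tell Spark the metastore schema is Hive 2.1.0 (not the built-in 1.2.x)
spark.sql.hive.metastore.version   2.1.0
# Classpath containing Hive 2.1.0 JARs and Hadoop config (placeholder paths)
spark.sql.hive.metastore.jars      /path/to/hive-2.1.0/lib/*:/path/to/hadoop/conf
```

Additionally, setting hive.metastore.schema.verification to true in hive-site.xml makes the metastore reject, rather than rewrite, a mismatched schema version.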
Hey, thanks for your answer. However, I found a solution to the problem; it was resolved by a new Spark patch. –