2017-06-10

My Hive metastore version is 2.1.0, but when I start my spark-shell it updates the version to 1.2.0. How do I stop Spark from setting a new Hive version?

17/06/11 12:04:03 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/root/spark-2.1.1-bin-hadoop2.7/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/root/spark/jars/datanucleus-core-3.2.10.jar." 
17/06/11 12:04:07 ERROR metastore.ObjectStore: Version information found in metastore differs 2.1.0 from expected schema version 1.2.0. Schema verififcation is disabled hive.metastore.schema.verification so setting version. 
17/06/11 12:04:09 WARN metastore.ObjectStore: Failed to get database global_temp, returning NoSuchObjectException 

This causes my Hive to stop working. I tried setting spark.sql.hive.metastore.version 2.1.0 in spark-defaults.conf ... but then my spark-shell doesn't work. Please help me with this.
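For reference, the relevant spark-defaults.conf entries would look like the sketch below. Note that Spark 2.1.x only accepts metastore client versions 0.12.0 through 1.2.1, so a value of 2.1.0 is rejected there, which would explain the spark-shell failure; newer Spark releases do accept it, and spark.sql.hive.metastore.jars must then point at matching Hive client jars. All values here are illustrative:

```
# spark-defaults.conf -- illustrative values, adjust to your installation
# Which Hive metastore client version Spark should use
# (only works on Spark versions that support 2.1.0)
spark.sql.hive.metastore.version   2.1.0
# Where to get the matching Hive client jars: "maven" downloads them,
# or supply a classpath pointing at a local Hive 2.1.0 installation
spark.sql.hive.metastore.jars      maven
```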

Answer


You should be able to disable version verification by updating your hive-site.xml:

<property>
    <name>hive.metastore.schema.verification</name>
    <value>false</value>
    <description>
     Enforce metastore schema version consistency.
     True: Verify that the version information stored in the metastore is compatible with the one from the Hive jars. Also disable automatic
      schema migration attempts. Users are required to manually migrate the schema after a Hive upgrade, which ensures
      proper metastore schema migration. (Default)
     False: Warn if the version information stored in the metastore doesn't match the one from the Hive jars.
    </description>
</property>
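Alternatively, if you prefer to keep schema verification enabled, you can repair the version record that Spark overwrote. In the standard Hive metastore layout the version lives in the VERSION table of the backing database, so something like the following sketch would restore it (assuming a MySQL-backed metastore; table and column names are the standard Hive metastore ones, but back up the metastore database before changing anything):

```sql
-- Illustrative: inspect and restore the schema version Spark set to 1.2.0.
-- Run against the metastore's backing database, after taking a backup.
SELECT SCHEMA_VERSION FROM VERSION;
UPDATE VERSION SET SCHEMA_VERSION = '2.1.0' WHERE VER_ID = 1;
```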

Hey, thanks for your answer. However, I found a solution to the problem: it was fixed by a new patch of Spark. –