Unable to fetch results from a Hive transaction-enabled table through Spark SQL
- I am using HDP with Spark 1.6.0 and Hive 1.2.1
Steps followed:
Create a Hive table:
hive>
CREATE TABLE orctest(PROD_ID bigint, CUST_ID bigint, TIME_ID timestamp, CHANNEL_ID bigint, PROMO_ID bigint, QUANTITY_SOLD decimal(10,0), AMOUNT_SOLD decimal(10,0)) CLUSTERED BY (PROD_ID) INTO 32 BUCKETS STORED AS ORC TBLPROPERTIES ("orc.compress"="SNAPPY", "transactional"="true");
Insert a record into orctest:
hive>
insert into orctest values(1, 1, '2016-08-02 21:36:54.000000000', 1, 1, 10, 10000);
Try to access the orctest table from spark-shell:
scala>
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
val s = hiveContext.table("orctest")
Exception thrown:
16/08/02 22:06:54 INFO OrcRelation: Listing hdfs://hadoop03:8020/apps/hive/warehouse/orctest on driver
16/08/02 22:06:54 INFO OrcRelation: Listing hdfs://hadoop03:8020/apps/hive/warehouse/orctest/delta_0000005_0000005 on driver
**java.lang.AssertionError: assertion failed**
at scala.Predef$.assert(Predef.scala:165)
at org.apache.spark.sql.execution.datasources.LogicalRelation$$anonfun$1.apply(LogicalRelation.scala:39)
at org.apache.spark.sql.execution.datasources.LogicalRelation$$anonfun$1.apply(LogicalRelation.scala:38)
at scala.Option.map(Option.scala:145)
at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:38)
at org.apache.spark.sql.execution.datasources.LogicalRelation.copy(LogicalRelation.scala:31)
at org.apache.spark.sql.hive.HiveMetastoreCatalog.org$apache$spark$sql$hive$HiveMetastoreCatalog$$convertToOrcRelation(HiveMetastoreCatalog.scala:588)
Any help would be greatly appreciated.
Have a look at http://stackoverflow.com/questions/27171702/error-in-scala-compiler-java-lang-assertionerror-assertion-failed-even-when-p – BruceWayne
Thanks Krishna for the comment. But I did not try this in a Scala project; I tried it directly in spark-shell. My thought: if you create a Hive table with the transactional property set to true, then you cannot access the table's contents through Spark (please correct me if I'm wrong). PS: I am using HDP with Spark 1.6.0 and Hive 1.2.1. –
I am also facing a similar problem. I am unable to load a transaction-enabled table into a Spark dataframe. Did you find a solution to this problem? I tried setting the Hive transaction properties using sqlContext.setConf() before attempting the load. I also tried creating a view on the source transactional table and hitting the view from Spark, but to no avail. –
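One workaround sometimes suggested for this class of problem (an assumption here, not verified on this HDP setup) is to run a major compaction in Hive. The stack trace shows Spark's OrcRelation listing a `delta_0000005_0000005` directory: Spark 1.6 reads plain ORC file layouts and does not understand the delta directories that ACID transactional tables write, so compacting the deltas into base files may let Spark read the table. A sketch in HiveQL:

```
-- Hypothetical workaround (untested on this setup): force a major
-- compaction so the transactional table's delta_* directories are
-- merged into base files that a plain ORC reader can handle.
ALTER TABLE orctest COMPACT 'major';

-- Compaction runs asynchronously; check that it has finished
-- before retrying the read from spark-shell.
SHOW COMPACTIONS;
```

Note that new writes to the table will create fresh delta directories, so this only helps until the next insert; it does not make Spark 1.6 fully ACID-aware.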