We are running into the following problem when trying to insert data into a Hive table. Error while saving a DataFrame to the Hive table:
Job aborted due to stage failure: Task 5 in stage 65.0 failed 4 times, most recent failure: Lost task 5.3 in stage 65.0 (TID 987, tnblf585.test.sprint.com): java.lang.ArrayIndexOutOfBoundsException: 45
    at org.apache.spark.sql.catalyst.expressions.GenericMutableRow.genericGet(rows.scala:254)
    at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow$class.getAs(rows.scala:35)
    at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow$class.isNullAt(rows.scala:36)
    at org.apache.spark.sql.catalyst.expressions.GenericMutableRow.isNullAt(rows.scala:248)
    at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$org$apache$spark$sql$hive$execution$InsertIntoHiveTable$$writeToFile$1$1.apply(InsertIntoHiveTable.scala:107)
    at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$org$apache$spark$sql$hive$execution$InsertIntoHiveTable$$writeToFile$1$1.apply(InsertIntoHiveTable.scala:104)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.org$apache$spark$sql$hive$execution$InsertIntoHiveTable$$writeToFile$1(InsertIntoHiveTable.scala:104)
    at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$saveAsHiveFile$3.apply(InsertIntoHiveTable.scala:84)
    at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$saveAsHiveFile$3.apply(InsertIntoHiveTable.scala:84)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Driver stacktrace:
It's not clear what, or how, you are trying to insert into the Hive table. An ArrayIndexOutOfBoundsException is thrown in the log. It looks like your data may be inconsistent. Check your data. – Kris
You don't ask anything specific, you don't provide source code, you don't provide sample data, you don't provide the target table structure, and you don't mention the Spark/Hive versions. Do you really expect an answer?!? –
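As the first comment suggests, an ArrayIndexOutOfBoundsException at a column index (here 45) during an insert typically means the rows being written have fewer fields than the writer expects, i.e. the DataFrame schema and the target table schema are out of step. A minimal sketch of the kind of pre-insert sanity check that can surface this, in plain Python with hypothetical `df_columns`/`table_columns` lists standing in for the two schemas (illustrative only, not Spark API code):

```python
def check_schema_alignment(df_columns, table_columns):
    """Return a list of human-readable problems that would make a row-by-row
    insert fail with an index-out-of-bounds style error."""
    problems = []
    if len(df_columns) != len(table_columns):
        problems.append(
            f"column count mismatch: DataFrame has {len(df_columns)} columns, "
            f"table expects {len(table_columns)}"
        )
    # Positional comparison: Hive inserts are positional, so names that do not
    # line up column-for-column usually indicate a reordering bug as well.
    for i, (df_col, tbl_col) in enumerate(zip(df_columns, table_columns)):
        if df_col.lower() != tbl_col.lower():
            problems.append(
                f"position {i}: DataFrame column '{df_col}' vs table column '{tbl_col}'"
            )
    return problems


# Example mirroring the error above: the writer reads index 45,
# but the row only has 45 fields (valid indices 0..44).
df_cols = [f"c{i}" for i in range(45)]
tbl_cols = [f"c{i}" for i in range(46)]
for problem in check_schema_alignment(df_cols, tbl_cols):
    print(problem)
```

In Spark itself the equivalent check is comparing `df.schema` against the table's schema before calling the insert, and selecting the DataFrame's columns in the table's column order so the positional write lines up.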