2016-01-14 27 views

UnsupportedOperationException error using Joda-Time

I am using the joda.time.DateTime library to convert a string datetime field, but it throws an "is not supported" exception. Here is the string-to-DateTime conversion code in my main class:

// create a new RDD from the input data, dropping the header
var inputDataWithoutHeader: RDD[String] = dropHeader(inputFile)
var inputDF1 = inputDataWithoutHeader.map(_.split(",")).map { p =>
  val dateYMD: DateTime = DateTimeFormat.forPattern("yyyy-MM-dd HH:mm:ss").parseDateTime(p(8))
  testData(dateYMD)
}.toDF().show()

p(8) is the column whose data type is defined as DateTime in the testData case class; the CSV data in that column looks like 2013-02-17 00:00:00.

Here is the testData case class:

case class testData(StartDate: DateTime) { } 

Here is my error:

Exception in thread "main" java.lang.UnsupportedOperationException: Schema for type org.joda.time.DateTime is not supported 
    at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:153) 
    at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:29) 
    at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$schemaFor$1.apply(ScalaReflection.scala:128) 
    at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$schemaFor$1.apply(ScalaReflection.scala:126) 
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244) 
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244) 
    at scala.collection.immutable.List.foreach(List.scala:318) 
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244) 
    at scala.collection.AbstractTraversable.map(Traversable.scala:105) 
    at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:126) 
    at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:29) 
    at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:64) 
    at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:29) 
    at org.apache.spark.sql.SQLContext.createDataFrame(SQLContext.scala:361) 
    at org.apache.spark.sql.SQLImplicits.rddToDataFrameHolder(SQLImplicits.scala:47) 
    at com.projs.poc.spark.ml.ProcessCSV$delayedInit$body.apply(ProcessCSV.scala:37) 

Answers

  1. As you can read in the official documentation, dates in Spark SQL are represented using java.sql.Timestamp. If you want to use Joda-Time you have to convert the output to the correct type.

  2. Spark SQL can easily handle standard date formats using type casting:

    sc.parallelize(Seq(Tuple1("2016-01-11 00:01:02"))) 
        .toDF("dt") 
        .select($"dt".cast("timestamp")) 
    
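If you still want to parse with Joda-Time, the conversion the answer describes boils down to taking the epoch milliseconds from the parsed DateTime and wrapping them in a java.sql.Timestamp. A minimal sketch outside Spark (assuming joda-time is on the classpath; the object name is illustrative):

```scala
import java.sql.Timestamp
import org.joda.time.format.DateTimeFormat

object JodaToTimestamp {
  def main(args: Array[String]): Unit = {
    val fmt = DateTimeFormat.forPattern("yyyy-MM-dd HH:mm:ss")
    // Parse the CSV string with Joda-Time, then move to java.sql.Timestamp
    // via the epoch millis, which is the type Spark SQL understands
    val joda = fmt.parseDateTime("2013-02-17 00:00:00")
    val ts = new Timestamp(joda.getMillis)
    println(ts) // 2013-02-17 00:00:00.0
  }
}
```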

I want to use Joda-Time, and would like to know where I am going wrong in converting the string to a DateTime – rk1113


'org.joda.time.DateTime' != 'java.sql.Timestamp' – zero323


But where do I use java.sql.Timestamp? In my main class and testData class I am using joda.time.DateTime – rk1113


Thanks zero323 for the solution. I used java.sql.Timestamp; here is my modified code:

val dateYMD: java.sql.Timestamp = new java.sql.Timestamp(
  DateTimeFormat.forPattern("yyyy-MM-dd HH:mm:ss").parseDateTime(p(8)).getMillis)
testData(dateYMD)
}.toDF().show()

and changed my case class to:

case class testData(GamingDate: java.sql.Timestamp) { } 
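If later processing still needs Joda-Time, a Timestamp read back out of the DataFrame can be converted in the other direction through its epoch milliseconds. A sketch, again assuming joda-time on the classpath (the object name is illustrative):

```scala
import java.sql.Timestamp
import org.joda.time.DateTime

object TimestampToJoda {
  def main(args: Array[String]): Unit = {
    val ts = Timestamp.valueOf("2013-02-17 00:00:00")
    // Rebuild a Joda DateTime from the epoch millis once the value
    // is outside Spark, e.g. after collecting rows from the DataFrame
    val joda = new DateTime(ts.getTime)
    println(joda.toString("yyyy-MM-dd HH:mm:ss")) // 2013-02-17 00:00:00
  }
}
```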