I want to create a Spark UDF that extracts a Map of (key, value) pairs from a user-defined case class. The plain Scala function seems to work fine, but when I try to turn it into a UDF in Spark 2.0 I get a "Schema for type Any is not supported" error.
case class myType(c1: String, c2: Int)
def getCaseClassParams(cc: Product): Map[String, Any] = {
cc
.getClass
.getDeclaredFields // all field names
.map(_.getName)
.zip(cc.productIterator.toSeq) // zipped with all values
.toMap
}
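As a quick check in plain Scala (outside Spark), using the myType case class above, it appears to return what I expect:

getCaseClassParams(myType("a", 1)) // Map(c1 -> a, c2 -> 1)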
However, when I try to instantiate a UDF from this function value, it results in the following error -
val ccUDF = udf{(cc: Product, i: String) => getCaseClassParams(cc).get(i)}
java.lang.UnsupportedOperationException: Schema for type Any is not supported
at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:716)
at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:668)
at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:654)
at org.apache.spark.sql.functions$.udf(functions.scala:2841)
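The trace comes from ScalaReflection.schemaFor, which udf uses to derive a Catalyst type from the function's return type, and Any (here from Map[String, Any].get, i.e. Option[Any]) has no corresponding Spark SQL type. A rough sketch of the direction I have been experimenting with, assuming the case-class column is passed in as a struct (which Spark hands to a Scala UDF as a Row) and stringifying the values so the return type is concrete:

import org.apache.spark.sql.Row
import org.apache.spark.sql.functions.udf

// Sketch: take field names from the struct's schema instead of Java reflection,
// and stringify values so the return type is Option[String] rather than Option[Any].
val ccStringUDF = udf { (cc: Row, i: String) =>
  cc.schema.fieldNames
    .zip(cc.toSeq.map(_.toString))
    .toMap
    .get(i)
}

This avoids the schema-inference failure, but it loses the original value types, so I would still like to know whether a Map[String, Any]-style UDF is possible at all.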