
sbt package error with leftJoinWithCassandraTable (spark-cassandra)

The spark-cassandra-connector function leftJoinWithCassandraTable works fine in spark-shell, but when I package the application with sbt I get the error below.

My Scala code snippet:

import com.datastax.spark.connector._  // provides SomeColumns and leftJoinWithCassandraTable

case class time_struct(id: String, month: Int, day: Int, platform: String,
                       type_1: String, title: String, time: Long)

val rdd = time_data.mapPartitions(data =>
  data.map(row =>
    time_struct(row.getString(0), row.getInt(1), row.getInt(2), row.getString(3),
                row.getString(4), row.getString(5), row.getDouble(6).toLong)))

val join = rdd.leftJoinWithCassandraTable("ks1", "tbl1", SomeColumns("time"),
  SomeColumns("id"))

$ sbt package

[error] Test.scala:187: could not find implicit value for parameter rwf: com.datastax.spark.connector.writer.RowWriterFactory[time_struct] 
[error]  val join = rdd.leftJoinWithCassandraTable("ks1", "tbl1",SomeColumns("time"), 
[error]           ^

build.sbt

scalaVersion := "2.11.8"

scalacOptions := Seq("-unchecked", "-deprecation", "-encoding", "utf8") 

libraryDependencies ++= {
  val sparkV = "2.1.0"

  Seq(
    "org.apache.spark" %% "spark-core" % sparkV % "provided",
    "org.apache.spark" %% "spark-sql" % sparkV % "provided",

    "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.0-RC1",
    "com.databricks" %% "spark-csv" % "1.5.0"
  )
}

libraryDependencies += "org.apache.hadoop" % "hadoop-aws" % "2.7.1" 

Environment:

Scala 2.11.8
Spark 2.1.0
sbt 0.13.13

Answer


I found my mistake: I had declared the case class inside the main function. A case class that is local to a method is a local class, and the compiler cannot derive the implicit RowWriterFactory[time_struct] for it, which is exactly what the error message reports. After I moved the case class definition out of the main function, it compiled successfully.
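
A minimal sketch of the working layout, assuming a standalone application (the object name Main, the SparkConf setup, and the elided RDD construction are illustrative, not from the question):

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

// Case class defined at the top level (outside main): the compiler can
// now derive the implicit RowWriterFactory[time_struct] that
// leftJoinWithCassandraTable requires.
case class time_struct(id: String, month: Int, day: Int, platform: String,
                       type_1: String, title: String, time: Long)

object Main {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("join-example"))
    // ... build time_data and map it to an RDD[time_struct] as in the question ...
    // val join = rdd.leftJoinWithCassandraTable("ks1", "tbl1",
    //   SomeColumns("time"), SomeColumns("id"))
  }
}

The same rule applies to any class the connector must derive a reader or writer for: define it at the top level or inside a top-level object, never inside a method.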