2016-09-07 63 views

I am new to Couchbase. I am trying to write data to Couchbase in local mode. My sample code follows. How do I write data to Couchbase using Spark & Scala?

val cfg = new SparkConf()
  .setAppName("couchbaseQuickstart")
  .setMaster("local[*]")
  .set("com.couchbase.bucket.MyBucket", "pwd")

val sc = new SparkContext(cfg)

val doc1 = JsonDocument.create("doc1", JsonObject.create().put("some", "content"))
val doc2 = JsonArrayDocument.create("doc2", JsonArray.from("more", "content", "in", "here"))

val data = sc.parallelize(Seq(doc1, doc2))

But I cannot access data.saveToCouchbase().

I am using Spark 1.6.1 & Scala 2.11.8.

I have the following dependencies in build.sbt:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.6.1" 
libraryDependencies += "com.couchbase.client" % "spark-connector_2.11" % "1.2.1" 

How do I write data to Couchbase using Spark & Scala?

Answer

It looks like you are just missing the import statement that makes the Couchbase functions available on RDDs and DataFrames:

import com.couchbase.spark._

val cfg = new SparkConf()
  .setAppName("couchbaseQuickstart")
  .setMaster("local[*]")
  .set("com.couchbase.bucket.MyBucket", "pwd")

val sc = new SparkContext(cfg)

val doc1 = JsonDocument.create("doc1", JsonObject.create().put("some", "content"))
val doc2 = JsonArrayDocument.create("doc2", JsonArray.from("more", "content", "in", "here"))

val data = sc.parallelize(Seq(doc1, doc2))
data.saveToCouchbase()
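As a sanity check after the save, the same import also adds a couchbaseGet method on SparkContext for fetching documents by ID. The sketch below is an assumption based on the connector's RDD API and needs a running local Couchbase node with the bucket configured above; it is not part of the original answer:

```scala
// Hedged sketch: read back the two documents written above.
// Assumes the SparkContext `sc` from the answer and a reachable
// local Couchbase bucket; couchbaseGet comes from com.couchbase.spark._
import com.couchbase.spark._
import com.couchbase.client.java.document.JsonDocument

val fetched = sc
  .couchbaseGet[JsonDocument](Seq("doc1", "doc2")) // RDD of fetched docs
  .collect()

// Print the ID and JSON content of each document that came back.
fetched.foreach(doc => println(s"${doc.id()}: ${doc.content()}"))
```

If the write succeeded, this should print the IDs doc1 and doc2 with their stored JSON content.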