
ElasticSearch Spark Error

I'm new to ElasticSearch and I'm trying to write some Apache Spark code to save data into ElasticSearch. I've entered the following lines into the Spark shell:

import org.elasticsearch.spark._ 
val myMap = Map("France" -> "FRA", "United States" -> "US") 
val myRDD = sc.makeRDD(Seq(myMap)) 
myRDD.saveToEs("Country/Abbrv") 

Error:

org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot determine write shards for [Country/Abbrv]; likely its format is incorrect (maybe it contains illegal characters?)

Spark 2.0.0, ElasticSearch Spark 2.3.4

Any ideas?

Answer


The problem was that I hadn't set the --conf variable before starting the Spark shell. The command needs to look like the following:

spark-shell --jars {path}/elasticsearch-spark_2.11-2.3.4.jar --conf spark.es.resource=Country/Abbrv 
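For reference, the connector also accepts settings passed directly to saveToEs as a Map, which avoids restarting the shell to change --conf values. A minimal sketch, assuming a local Elasticsearch node at localhost:9200 (adjust es.nodes to your cluster):

import org.elasticsearch.spark._ 
val myMap = Map("France" -> "FRA", "United States" -> "US") 
val myRDD = sc.makeRDD(Seq(myMap)) 
// Pass connector settings per call instead of via --conf; 
// the es.nodes value here is an assumption for a local cluster. 
myRDD.saveToEs("Country/Abbrv", Map("es.nodes" -> "localhost:9200")) 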