I have an Elasticsearch Docker image listening on 127.0.0.1:9200. I tested it with Sense and Kibana, and it works fine: I can index and query documents. But when I try to write to it from a Spark application, the Spark app fails to write to the Elasticsearch cluster running in Docker:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.elasticsearch.spark._

val sparkConf = new SparkConf().setAppName("ES").setMaster("local")
sparkConf.set("es.index.auto.create", "true")
sparkConf.set("es.nodes", "127.0.0.1")
sparkConf.set("es.port", "9200")
sparkConf.set("es.resource", "spark/docs")
val sc = new SparkContext(sparkConf)
val sqlContext = new SQLContext(sc)
val numbers = Map("one" -> 1, "two" -> 2, "three" -> 3)
val airports = Map("arrival" -> "Otopeni", "SFO" -> "San Fran")
val rdd = sc.parallelize(Seq(numbers, airports))
rdd.saveToEs("spark/docs")
The connection fails on write, and it keeps retrying:
16/07/11 17:20:07 INFO HttpMethodDirector: I/O exception (java.net.ConnectException) caught when processing request: Operation timed out
16/07/11 17:20:07 INFO HttpMethodDirector: Retrying request
I also tried the container IP address that docker inspect reports for the Elasticsearch image, and that does not work either. However, when I use a local installation of Elasticsearch, the Spark app runs fine. Any ideas?
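For context, one thing I have been looking at is the elasticsearch-hadoop setting `es.nodes.wan.only`, which tells the connector to talk only to the addresses listed in `es.nodes` instead of discovering cluster nodes (discovery can return container-internal IPs that are unreachable from the host when Elasticsearch sits behind Docker port mapping). A sketch of the settings involved, written as a plain Map here so it is self-contained; whether this applies to my setup is an assumption:

```scala
// Hypothetical connector settings for an Elasticsearch behind Docker port
// mapping. In the real app these would be sparkConf.set(...) calls.
val esConf = Map(
  "es.nodes" -> "127.0.0.1",
  "es.port"  -> "9200",
  // Use the listed node as-is and skip cluster discovery, which would
  // otherwise hand back container-internal addresses.
  "es.nodes.wan.only" -> "true"
)

esConf.foreach { case (k, v) => println(s"$k=$v") }
```

In the actual application these would be applied with `sparkConf.set(key, value)` before creating the `SparkContext`, alongside the existing `es.index.auto.create` setting.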