After launching a Spark EC2 cluster, I started sparkR from /root with the command below. I want to verify the number of active worker nodes.
$ ./spark/bin/sparkR
A few lines of the resulting output include:
16/11/20 10:13:51 WARN SparkConf:
SPARK_WORKER_INSTANCES was detected (set to '1').
This is deprecated in Spark 1.0+.
Please instead use:
- ./spark-submit with --num-executors to specify the number of executors
- Or set SPARK_EXECUTOR_INSTANCES
- spark.executor.instances to configure the number of instances in the spark config.
So, following that advice, I added the last line below to spark-defaults.conf:
$ pwd
/root/spark/conf
$ cat spark-defaults.conf
spark.executor.memory 512m
spark.executor.extraLibraryPath /root/ephemeral-hdfs/lib/native/
spark.executor.extraClassPath /root/ephemeral-hdfs/conf
spark.executor.instances 2
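As a quick sanity check before restarting the cluster (a minimal sketch: the /tmp path and file contents below are illustrative copies, not the real cluster files), spark-defaults.conf uses whitespace-separated key/value pairs, so the new property can be verified with awk:

```shell
# Illustrative copy of the relevant conf lines; the path is hypothetical
conf=/tmp/spark-defaults.conf
cat > "$conf" <<'EOF'
spark.executor.memory 512m
spark.executor.instances 2
EOF

# spark-defaults.conf lines are "key<whitespace>value"; print the value
awk '$1 == "spark.executor.instances" { print $2 }' "$conf"
# → 2
```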
With this change, the warning is no longer printed.
From within sparkR, how can I verify the number of worker nodes that will be used?
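One way to check from outside sparkR (a hedged sketch, not taken from the question): a standalone Spark master serves its cluster state as JSON at http://&lt;master&gt;:8080/json, and its `workers` array lists the registered workers. The snippet below counts worker entries in a saved sample response; the JSON content and path are illustrative placeholders, and on a live cluster you would fetch the real response with `curl -s http://<master>:8080/json` first.

```shell
# Hypothetical sample of the master's /json response (fields abbreviated)
cat > /tmp/master.json <<'EOF'
{"url":"spark://master:7077","workers":[{"id":"worker-1","state":"ALIVE"},{"id":"worker-2","state":"ALIVE"}]}
EOF

# Count registered workers by their "id" fields (one match per worker)
grep -o '"id":"worker-[^"]*"' /tmp/master.json | wc -l
# two matches → prints 2
```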
Could you share the Spark configuration parameters you are using? –
OK, I added that line to spark-defaults.conf as suggested –