I am trying to run my Scala application with Eclipse on macOS Sierra, and I get this error at runtime: Error initializing SparkContext on Mac.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/09/20 11:32:29 INFO StreamingExamples: Setting log level to [WARN] for
streaming example. To override add a custom log4j.properties to the classpath.
17/09/20 11:32:30 WARN NativeCodeLoader: Unable to load native-hadoop library
for your platform... using builtin-java classes where applicable
17/09/20 11:32:31 WARN Utils: Your hostname, 127.0.0.1 resolves to a loopback
address: 127.0.0.1; using 10.254.169.69 instead (on interface en0)
17/09/20 11:32:31 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
17/09/20 11:32:33 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: 'Local[*]'
at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2735)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:522)
at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
at bigdata.spark_applications.mapr$.main(mapr.scala:30)
at bigdata.spark_applications.mapr.main(mapr.scala)
17/09/20 11:32:33 WARN MetricsSystem: Stopping a MetricsSystem that is not running
Exception in thread "main" org.apache.spark.SparkException: Could not parse Master URL: 'Local[*]'
at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2735)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:522)
at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
at bigdata.spark_applications.mapr$.main(mapr.scala:30)
at bigdata.spark_applications.mapr.main(mapr.scala)
Can anyone please help me? Thanks.
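Note: the exception says it could not parse the master URL `'Local[*]'` (capital L). Spark master URL strings are case-sensitive, so the likely fix is to pass lowercase `local[*]`. A minimal sketch of the context setup, assuming the app name and a 1-second batch interval (both are illustrative, not taken from the original `mapr.scala`):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// The master URL is case-sensitive: "Local[*]" fails with
// "Could not parse Master URL", while "local[*]" runs Spark
// locally with one worker thread per CPU core.
val conf = new SparkConf()
  .setAppName("mapr")      // app name is an assumption; use your own
  .setMaster("local[*]")   // lowercase "local", not "Local"

// 1-second batch interval is assumed for illustration.
val ssc = new StreamingContext(conf, Seconds(1))
```

The same applies if the master is set via `spark-submit --master` or `spark.master` in a properties file: the value must be exactly `local`, `local[N]`, or `local[*]` (or a `spark://`, `yarn`, `mesos://` URL) in lowercase.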