Error when starting a local spark-shell on a Mac

I get the following error when I try to start spark-shell locally on my Mac. Because the error occurs while sc is being created, sc is not available afterwards.

[[email protected]:data] $ spark-shell 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
Setting default log level to "WARN". 
To adjust logging level use sc.setLogLevel(newLevel). 
16/12/29 13:57:06 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
16/12/29 13:57:12 ERROR SparkContext: Error initializing SparkContext. 
java.lang.reflect.InvocationTargetException 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 
    at org.apache.spark.executor.Executor.addReplClassLoaderIfNeeded(Executor.scala:453) 
    at org.apache.spark.executor.Executor.<init>(Executor.scala:99) 
    at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:59) 
    at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:126) 
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149) 
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:497) 
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2275) 
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831) 
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823) 
    at scala.Option.getOrElse(Option.scala:121) 
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823) 
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95) 
    at $line3.$read$$iw$$iw.<init>(<console>:15) 
    at $line3.$read$$iw.<init>(<console>:31) 
    at $line3.$read.<init>(<console>:33) 
    at $line3.$read$.<init>(<console>:37) 
    at $line3.$read$.<clinit>(<console>) 
    at $line3.$eval$.$print$lzycompute(<console>:7) 
    at $line3.$eval$.$print(<console>:6) 
    at $line3.$eval.$print(<console>) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786) 
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047) 
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638) 
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637) 
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31) 
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19) 
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637) 
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569) 
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565) 
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807) 
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681) 
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395) 
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38) 
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) 
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) 
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214) 
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37) 
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:94) 
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920) 
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909) 
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909) 
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97) 
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909) 
    at org.apache.spark.repl.Main$.doMain(Main.scala:68) 
    at org.apache.spark.repl.Main$.main(Main.scala:51) 
    at org.apache.spark.repl.Main.main(Main.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736) 
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185) 
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210) 
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124) 
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
Caused by: java.net.URISyntaxException: Malformed escape pair at index 38: spark://fe80:0:0:0:44cd:86ff:fe5b:cd90%7:61707/classes 
    at java.net.URI$Parser.fail(URI.java:2848) 
    at java.net.URI$Parser.scanEscape(URI.java:2978) 
    at java.net.URI$Parser.scan(URI.java:3001) 
    at java.net.URI$Parser.parseAuthority(URI.java:3142) 
    at java.net.URI$Parser.parseHierarchical(URI.java:3097) 
    at java.net.URI$Parser.parse(URI.java:3053) 
    at java.net.URI.<init>(URI.java:588) 
    at org.apache.spark.repl.ExecutorClassLoader.<init>(ExecutorClassLoader.scala:48) 
    ... 63 more 
16/12/29 13:57:12 ERROR Utils: Uncaught exception in thread main 
java.lang.NullPointerException 
    at org.apache.spark.scheduler.local.LocalSchedulerBackend.org$apache$spark$scheduler$local$LocalSchedulerBackend$$stop(LocalSchedulerBackend.scala:158) 
    at org.apache.spark.scheduler.local.LocalSchedulerBackend.stop(LocalSchedulerBackend.scala:137) 
    at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:455) 
    at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1605) 
    at org.apache.spark.SparkContext$$anonfun$stop$8.apply$mcV$sp(SparkContext.scala:1798) 
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1287) 
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1797) 
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:565) 
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2275) 
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831) 
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823) 
    at scala.Option.getOrElse(Option.scala:121) 
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823) 
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95) 
    at $line3.$read$$iw$$iw.<init>(<console>:15) 
    at $line3.$read$$iw.<init>(<console>:31) 
    at $line3.$read.<init>(<console>:33) 
    at $line3.$read$.<init>(<console>:37) 
    at $line3.$read$.<clinit>(<console>) 
    at $line3.$eval$.$print$lzycompute(<console>:7) 
    at $line3.$eval$.$print(<console>:6) 
    at $line3.$eval.$print(<console>) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786) 
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047) 
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638) 
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637) 
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31) 
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19) 
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637) 
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569) 
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565) 
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807) 
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681) 
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395) 
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38) 
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) 
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) 
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214) 
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37) 
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:94) 
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920) 
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909) 
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909) 
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97) 
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909) 
    at org.apache.spark.repl.Main$.doMain(Main.scala:68) 
    at org.apache.spark.repl.Main$.main(Main.scala:51) 
    at org.apache.spark.repl.Main.main(Main.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736) 
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185) 
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210) 
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124) 
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
16/12/29 13:57:12 WARN MetricsSystem: Stopping a MetricsSystem that is not running 
java.net.URISyntaxException: Malformed escape pair at index 38: spark://fe80:0:0:0:44cd:86ff:fe5b:cd90%7:61707/classes 
    at java.net.URI$Parser.fail(URI.java:2848) 
    at java.net.URI$Parser.scanEscape(URI.java:2978) 
    at java.net.URI$Parser.scan(URI.java:3001) 
    at java.net.URI$Parser.parseAuthority(URI.java:3142) 
    at java.net.URI$Parser.parseHierarchical(URI.java:3097) 
    at java.net.URI$Parser.parse(URI.java:3053) 
    at java.net.URI.<init>(URI.java:588) 
    at org.apache.spark.repl.ExecutorClassLoader.<init>(ExecutorClassLoader.scala:48) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 
    at org.apache.spark.executor.Executor.addReplClassLoaderIfNeeded(Executor.scala:453) 
    at org.apache.spark.executor.Executor.<init>(Executor.scala:99) 
    at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:59) 
    at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:126) 
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149) 
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:497) 
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2275) 
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831) 
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823) 
    at scala.Option.getOrElse(Option.scala:121) 
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823) 
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95) 
    ... 47 elided 
<console>:14: error: not found: value spark 
       import spark.implicits._ 
              ^ 
<console>:14: error: not found: value spark 
       import spark.sql 
              ^ 
Welcome to 
      ____              __ 
     / __/__  ___ _____/ /__ 
    _\ \/ _ \/ _ `/ __/  '_/ 
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.1 
      /_/ 

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_101) 
Type in expressions to have them evaluated. 
Type :help for more information. 

scala> sc 
<console>:18: error: not found: value sc 
       sc 
       ^ 

Answer

After searching around and digging through the logs, I suspected this happens because my hostname could not be found in /etc/hosts. The Caused by line supports this: with no resolvable hostname, Spark fell back to a link-local IPv6 address, and java.net.URI rejects the interface zone ID in spark://fe80:0:0:0:44cd:86ff:fe5b:cd90%7:61707/classes because the % at index 38 looks like a malformed escape pair. After changing my hostname from "koders.local" to "koders", which does exist in /etc/hosts, the problem was solved.
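If you want to verify this before changing anything, here is a minimal check-and-fix sequence (a sketch assuming, as in my case, that the short name "koders" is the entry that already exists in /etc/hosts; scutil is the standard macOS tool for setting the hostname):

$ hostname                          # the name Spark will try to resolve, e.g. "koders.local" 
$ grep "$(hostname)" /etc/hosts     # no output means that name is missing from /etc/hosts 
$ sudo scutil --set HostName koders # switch to a name /etc/hosts does contain 
                                    # (alternatively, add the current name to /etc/hosts) 
$ spark-shell                       # relaunch; sc should now be created 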

Having the same problem, but this didn't work for me. http://stackoverflow.com/questions/41914586/spark-fails-to-start-in-local-mode-when-disconnected-possible-bug-in-handling-i – Aliostad
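For anyone in the same situation where the /etc/hosts change does not help, one workaround worth trying (an assumption on my part, not something confirmed in this thread) is to pin Spark to the loopback address via the documented SPARK_LOCAL_IP environment variable, so it never selects the link-local IPv6 interface whose zone ID breaks the URI parser:

$ export SPARK_LOCAL_IP=127.0.0.1   # bind the driver and its class server to loopback 
$ spark-shell 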