
SparkContext not initializing after sbt run

I have my build.sbt file as follows:

name := "hello" 

version := "1.0" 

scalaVersion := "2.11.8" 

val sparkVersion = "1.6.1" 

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion, 
    "org.apache.spark" %% "spark-streaming" % sparkVersion, 
    "org.apache.spark" %% "spark-streaming-twitter" % sparkVersion 
) 

I also have example.scala at src/main/scala/example.scala:

import org.apache.spark._ 
import org.apache.spark.SparkContext._ 

object WordCount { 
    def main(args: Array[String]) { 
     val conf = new SparkConf().setAppName("wordCount").setMaster("local") 
     val sc = new SparkContext(conf) 
     val input = sc.textFile("food.txt") 
     val words = input.flatMap(line => line.split(" ")) 
     val counts = words.map(word => (word, 1)).reduceByKey{case (x, y) => x + y} 
     counts.saveAsTextFile("output.txt") 
    } 
} 

For some reason, when I do sbt run in my root directory (not in src/main/scala), I get the error:

[info] Running WordCount 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
16/06/21 22:05:08 INFO SparkContext: Running Spark version 1.6.1 
16/06/21 22:05:08 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
16/06/21 22:05:09 ERROR SparkContext: Error initializing SparkContext. 
java.net.UnknownHostException: LM-SFA-11002982: LM-SFA-11002982: nodename nor servname provided, or not known 
    at java.net.InetAddress.getLocalHost(InetAddress.java:1475) 
    at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:788) 
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:781) 
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:781) 
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:838) 
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:838) 
    at scala.Option.getOrElse(Option.scala:121) 
    at org.apache.spark.util.Utils$.localHostName(Utils.scala:838) 
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:420) 
    at WordCount$.main(exam.scala:8) 
    at WordCount.main(exam.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at sbt.Run.invokeMain(Run.scala:67) 
    at sbt.Run.run0(Run.scala:61) 
    at sbt.Run.sbt$Run$$execute$1(Run.scala:51) 
    at sbt.Run$$anonfun$run$1.apply$mcV$sp(Run.scala:55) 
    at sbt.Run$$anonfun$run$1.apply(Run.scala:55) 
    at sbt.Run$$anonfun$run$1.apply(Run.scala:55) 
    at sbt.Logger$$anon$4.apply(Logger.scala:84) 
    at sbt.TrapExit$App.run(TrapExit.scala:248) 
    at java.lang.Thread.run(Thread.java:745) 
Caused by: java.net.UnknownHostException: LM-SFA-11002982: nodename nor servname provided, or not known 
    at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method) 
    at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:901) 
    at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1295) 
    at java.net.InetAddress.getLocalHost(InetAddress.java:1471) 
    ... 23 more 
16/06/21 22:05:09 INFO SparkContext: Successfully stopped SparkContext 

Can someone explain the problem described in this error? Is it because my dependencies are not installed correctly, or is there another reason?

Answer


It looks like your system's hostname cannot be resolved to an IP address.
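
A quick way to confirm that is to run the same lookup Spark attempts (the stack trace shows Utils.findLocalInetAddress ending up in InetAddress.getLocalHost), for example from a Scala REPL; a minimal check:

import java.net.InetAddress

// This is the call Spark's findLocalInetAddress falls back to.
// If it also throws UnknownHostException here, the hostname itself is the problem.
println(InetAddress.getLocalHost)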

As a (pretty lame) workaround you can try:

echo "127.0.0.1 LM-SFA-11002982" | sudo tee -a /etc/hosts 

That statically maps a hostname that can change, for example when you connect your machine to another network. Depending on your OS there are better ways to set up hostname resolution, for example via the 'hostname' command.
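
If you would rather not edit /etc/hosts, Spark also reads the SPARK_LOCAL_IP environment variable before falling back to hostname resolution (that is the findLocalInetAddress call in the stack trace), so launching with the loopback address should sidestep the failing lookup; a minimal example, assuming you run from the project root:

SPARK_LOCAL_IP=127.0.0.1 sbt run

This only affects the current invocation, so nothing permanent changes on the machine.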


On a Mac, 'sudo hostname LM-SFA-11002982.local' does the job.
