2015-11-19 77 views

I am trying to launch a Spark job (Spark 1.4.0) on a cluster. Whether I run it from the command line or from Eclipse, I get an error about the withDummyCallSite function being missing from the Spark Utils class. In the Maven dependencies I can see that spark-core_2.10-1.4.0.jar is loaded, which should contain this function. I am running Java 1.7, the same Java version the code was compiled with. I can see on the Spark Master monitor that the job has started, so it does not appear to be a firewall issue. Below is the error I see in the console (from both the command line and Eclipse):

ERROR 09:53:06,314 Logging.scala:75 -- Task 0 in stage 1.0 failed 4 times; aborting job 
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder". 
SLF4J: Defaulting to no-operation (NOP) logger implementation 
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details. 
java.lang.NoSuchMethodError: org.apache.spark.util.Utils$.withDummyCallSite(Lorg/apache/spark/SparkContext;Lscala/Function0;)Ljava/lang/Object; 
    at org.apache.spark.sql.parquet.ParquetRelation2.buildScan(newParquet.scala:269) 
    at org.apache.spark.sql.sources.HadoopFsRelation.buildScan(interfaces.scala:530) 
    at org.apache.spark.sql.sources.DataSourceStrategy$$anonfun$8.apply(DataSourceStrategy.scala:98) 
    at org.apache.spark.sql.sources.DataSourceStrategy$$anonfun$8.apply(DataSourceStrategy.scala:98) 
    at org.apache.spark.sql.sources.DataSourceStrategy$$anonfun$pruneFilterProject$1.apply(DataSourceStrategy.scala:266) 
    at org.apache.spark.sql.sources.DataSourceStrategy$$anonfun$pruneFilterProject$1.apply(DataSourceStrategy.scala:265) 
    at org.apache.spark.sql.sources.DataSourceStrategy$.pruneFilterProjectRaw(DataSourceStrategy.scala:296) 
    at org.apache.spark.sql.sources.DataSourceStrategy$.pruneFilterProject(DataSourceStrategy.scala:261) 
    at org.apache.spark.sql.sources.DataSourceStrategy$.apply(DataSourceStrategy.scala:94) 
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58) 
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58) 
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371) 
    at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:59) 
    at org.apache.spark.sql.catalyst.planning.QueryPlanner.planLater(QueryPlanner.scala:54) 
    at org.apache.spark.sql.execution.SparkStrategies$HashAggregation$.apply(SparkStrategies.scala:162) 
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58) 
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58) 
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371) 
    at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:59) 
    at org.apache.spark.sql.SQLContext$QueryExecution.sparkPlan$lzycompute(SQLContext.scala:932) 
    at org.apache.spark.sql.SQLContext$QueryExecution.sparkPlan(SQLContext.scala:930) 
    at org.apache.spark.sql.SQLContext$QueryExecution.executedPlan$lzycompute(SQLContext.scala:936) 
    at org.apache.spark.sql.SQLContext$QueryExecution.executedPlan(SQLContext.scala:936) 
    at org.apache.spark.sql.DataFrame.collect(DataFrame.scala:1255) 
    at org.apache.spark.sql.DataFrame.count(DataFrame.scala:1269) 

(log truncated for brevity)

Thanks in advance for any pointers!

Answer


Check how your class is being resolved by Maven, using Eclipse's Open Type shortcut (Ctrl+Shift+T). Make sure it is not being resolved from two different jars on your classpath.
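Besides the Eclipse shortcut, you can check at runtime which jar a class was actually loaded from. This is a minimal sketch (the class name `WhereFrom` and the helper are mine, not from the question); pointing it at `org.apache.spark.util.Utils` on the asker's classpath would reveal whether a second jar is shadowing spark-core:

```java
import java.security.CodeSource;

public class WhereFrom {
    // Returns the location (jar or directory) a class was loaded from,
    // which is how you detect duplicate/conflicting jars at runtime.
    static String locationOf(String className) throws ClassNotFoundException {
        Class<?> cls = Class.forName(className);
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        // Classes loaded by the bootstrap loader (e.g. java.lang.*) have no CodeSource.
        return src == null ? "bootstrap classloader" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // In the asker's setup you would pass "org.apache.spark.util.Utils"
        // and expect a path ending in spark-core_2.10-1.4.0.jar.
        System.out.println(locationOf("WhereFrom"));
    }
}
```

If the printed jar is not the one you expect, some other dependency on the classpath is supplying an older copy of the class.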

If your class is being pulled in through a transitive dependency, add the jar you need as a direct dependency, pinned to the version you require.
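As a sketch, pinning spark-core as a direct dependency in the pom.xml would look like this (coordinates taken from the version named in the question):

```xml
<!-- Declaring spark-core directly makes this version win over any
     transitive copy pulled in by another dependency. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.0</version>
</dependency>
```

Running `mvn dependency:tree` afterwards shows which transitive paths were bringing in a conflicting version.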

You can refer to these links for more background:

mockito test gives no such method error when run as junit test but when jars are added manually in run confugurations, it runs well

Exception in thread "main" java.lang.NoSuchMethodError: org.slf4j.impl.StaticLoggerBinder.getSingleton()Lorg/slf4j/impl/StaticLoggerBinder


Thanks for your response. I checked the contents of spark-core_2.10-1.4.0.jar again (the copy downloaded via the dependency I specified in pom.xml), and the withDummyCallSite function was indeed missing. I manually downloaded the jar from http://mvnrepository.com and replaced it, and the problem is now gone. It is exactly the same version (2.10-1.4.0); no idea why the function was missing in the first place. – bbtus


No problem! Glad to hear that manually replacing the jar solved your issue. In cases like this you can actually just delete the dependency's folder structure from your local cache/repo (e.g. C:/Users/{your username}/.m2/repository/org/apache/spark). After you delete it, Maven downloads a fresh copy from your remote repository, and you end up with a clean copy of the specified version in your local repository. – asg
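The cache-clearing step described above can be sketched as shell commands. The real cache lives under `~/.m2/repository` (or `C:/Users/{username}/.m2/repository` on Windows); the runnable demo below operates on a throwaway directory instead, so it is safe to execute, and only the commented-out lines show the real-world equivalent:

```shell
# Real-world equivalent (do NOT run blindly; it deletes cached Spark artifacts):
#   rm -rf ~/.m2/repository/org/apache/spark
#   mvn -U clean package      # -U forces Maven to re-check remote repositories

# Safe demonstration against a temporary directory that mimics the cache layout:
REPO="$(mktemp -d)"
mkdir -p "$REPO/org/apache/spark/spark-core_2.10/1.4.0"
touch "$REPO/org/apache/spark/spark-core_2.10/1.4.0/spark-core_2.10-1.4.0.jar"

# The actual purge step: remove the group's folder so Maven re-downloads it.
rm -rf "$REPO/org/apache/spark"

test ! -e "$REPO/org/apache/spark" && echo "artifact purged; Maven will re-download it"
```

An alternative to deleting folders by hand is the Maven dependency plugin's `dependency:purge-local-repository` goal, which removes and re-resolves artifacts for you.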