It looks like the 2.11 build is broken; you should report it to the project. I don't know how to fix it right now.
➜ spark-cassandra-connector git:(master) sbt
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
[info] Loading project definition from /Users/jacek/dev/oss/spark-cassandra-connector/project
Using releases: https://oss.sonatype.org/service/local/staging/deploy/maven2 for releases
Using snapshots: https://oss.sonatype.org/content/repositories/snapshots for snapshots
Scala: 2.10.5 [To build against Scala 2.11 use '-Dscala-2.11=true']
Scala Binary: 2.10
Java: target=1.7 user=1.8.0_66
[info] Set current project to root (in build file:/Users/jacek/dev/oss/spark-cassandra-connector/)
[root]> update
...
[info] Done updating.
[info] Done updating.
[success] Total time: 314 s, completed Dec 2, 2015 10:26:01 AM
[root]>
➜ spark-cassandra-connector git:(master) sbt -Dscala-2.11=true
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
[info] Loading project definition from /Users/jacek/dev/oss/spark-cassandra-connector/project
Using releases: https://oss.sonatype.org/service/local/staging/deploy/maven2 for releases
Using snapshots: https://oss.sonatype.org/content/repositories/snapshots for snapshots
Scala: 2.11.7
Scala Binary: 2.11
Java: target=1.7 user=1.8.0_66
[info] Set current project to root (in build file:/Users/jacek/dev/oss/spark-cassandra-connector/)
[root]> update
...
[error] impossible to get artifacts when data has not been loaded. IvyNode = org.slf4j#slf4j-log4j12;1.7.6
...
[trace] Stack trace suppressed: run last spark-cassandra-connector-embedded/*:update for the full output.
[error] (spark-cassandra-connector-embedded/*:update) java.lang.IllegalStateException: impossible to get artifacts when data has not been loaded. IvyNode = org.slf4j#slf4j-log4j12;1.7.6
[error] Total time: 9 s, completed Dec 2, 2015 10:27:19 AM
I have filed an issue: https://datastax-oss.atlassian.net/browse/SPARKC-295.
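In the meantime, a generic first step for this class of Ivy failure ("impossible to get artifacts when data has not been loaded") is to clear the cached Ivy metadata for the offending module and re-run the update. A minimal sketch, assuming the default ~/.ivy2 cache location; this is a common workaround for stale Ivy state, not a confirmed fix for SPARKC-295:

# remove stale Ivy metadata for the failing module (default sbt cache path)
rm -rf ~/.ivy2/cache/org.slf4j
# retry dependency resolution against Scala 2.11
sbt -Dscala-2.11=true update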
I think you're right, Jacek. There is a similar JIRA bug, https://datastax-oss.atlassian.net/browse/SPARKC-249, but it is only fixed for Spark 1.5.0, not 1.5.1. I'll try 1.5.0 and see whether it works. –
I don't think so. We haven't touched Spark at all here; I'm reporting an issue against the connector itself. –
OK - thanks. I've hit several problems with Spark and Scala versioning - you have to build Spark manually for Scala 2.11, you have to build the Spark-Connector assembly manually, and so on. I think it might be easier to fall back to older versions of Scala/Spark to eliminate some of these manual steps, even if that means waiting before we can use some of the cool new features. –
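For reference, the manual steps mentioned in that comment look roughly like the following. This is a sketch based on the Spark 1.5.x build documentation; the script path and Maven flags come from those docs rather than from this thread, so double-check them against your Spark version:

# in the Spark source tree: switch the build to Scala 2.11, then build Spark
./dev/change-scala-version.sh 2.11
mvn -Dscala-2.11=true -DskipTests clean package

# in the spark-cassandra-connector checkout: build the connector assembly jar
sbt -Dscala-2.11=true assembly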