
Did you actually do the full "git clone" steps from those linked threads? Do you really need to modify jblas? If not, you should just pull it from Maven Central with --packages org.jblas:jblas:1.2.4, with no git clone or mvn install needed; the following worked fine for me on a fresh Dataproc cluster:

$ spark-shell --packages org.jblas:jblas:1.2.4 
Ivy Default Cache set to: /home/dhuo/.ivy2/cache 
The jars for the packages stored in: /home/dhuo/.ivy2/jars 
:: loading settings :: url = jar:file:/usr/lib/spark/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml 
org.jblas#jblas added as a dependency 
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0 
    confs: [default] 
    found org.jblas#jblas;1.2.4 in central 
downloading https://repo1.maven.org/maven2/org/jblas/jblas/1.2.4/jblas-1.2.4.jar ... 
    [SUCCESSFUL ] org.jblas#jblas;1.2.4!jblas.jar (605ms) 
:: resolution report :: resolve 713ms :: artifacts dl 608ms 
    :: modules in use: 
    org.jblas#jblas;1.2.4 from central in [default] 
    --------------------------------------------------------------------- 
    |     |   modules   || artifacts | 
    |  conf  | number| search|dwnlded|evicted|| number|dwnlded| 
    --------------------------------------------------------------------- 
    |  default  | 1 | 1 | 1 | 0 || 1 | 1 | 
    --------------------------------------------------------------------- 
:: retrieving :: org.apache.spark#spark-submit-parent 
    confs: [default] 
    1 artifacts copied, 0 already retrieved (10360kB/29ms) 
Setting default log level to "WARN". 
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel). 
ivysettings.xml file not found in HIVE_HOME or HIVE_CONF_DIR,/etc/hive/conf.dist/ivysettings.xml will be used 
Spark context Web UI available at http://10.240.2.221:4040 
Spark context available as 'sc' (master = yarn, app id = application_1501548510890_0005). 
Spark session available as 'spark'. 
Welcome to 
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
      /_/

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_131) 
Type in expressions to have them evaluated. 
Type :help for more information. 

scala> import org.jblas.DoubleMatrix 
import org.jblas.DoubleMatrix 

scala> :quit 

Additionally, if you need to submit jobs that require "packages" via Dataproc's job-submission API, then because --packages is actually syntactic sugar in the various Spark launcher scripts rather than a Spark job property, you need to use the equivalent spark.jars.packages property instead in that case, as explained in this StackOverflow answer.
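As a sketch of what that looks like with the gcloud CLI (the cluster name, main class, and jar path below are placeholders, not from the original question):

```shell
# Submit a Spark job to Dataproc, pulling jblas from Maven Central at runtime.
# --properties sets Spark configuration; spark.jars.packages is the job-property
# equivalent of the spark-shell/spark-submit --packages flag.
gcloud dataproc jobs submit spark \
  --cluster=my-cluster \
  --properties=spark.jars.packages=org.jblas:jblas:1.2.4 \
  --class=com.example.MyJob \
  --jars=gs://my-bucket/my-job.jar
```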


Thank you very much for the prompt reply. Your answer solved my problem. +1 and answer accepted :) – santobedi
