Spark jobserver error: ClassNotFoundException

I have been trying out Spark using spark-shell. All of my data is in SQL.
I used to include external jars with the --jars flag, like this:

/bin/spark-shell --jars /path/to/mysql-connector-java-5.1.23-bin.jar --master spark://sparkmaster.com:7077

I have also added the jar to the classpath by editing the bin/compute-classpath.sh file, and I was running successfully with this configuration.
Now, when I run a standalone job through the jobserver, I get the following error message:
result: {
  "message" : "com.mysql.jdbc.Driver",
  "errorClass" : "java.lang.ClassNotFoundException",
  "stack" : [.......]
}
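As a quick way to narrow this down, you can probe whether a class is visible to the current classloader with Class.forName. A minimal sketch (the class name com.mysql.jdbc.Driver comes from the error above; DriverCheck is just a hypothetical helper name, and it could be run inside the job to see what the job's classloader can actually load):

```java
public class DriverCheck {
    // Returns true if the named class can be loaded by the current classloader.
    static boolean isLoadable(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String name = args.length > 0 ? args[0] : "com.mysql.jdbc.Driver";
        System.out.println(name + " loadable: " + isLoadable(name));
    }
}
```

If this prints false inside the jobserver context but true in spark-shell, the driver jar is reaching one classpath but not the other.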
I have already included the jar file in my local.conf file, as shown below:

context-settings {
  .....
  dependent-jar-uris = ["file:///absolute/path/to/the/jarfile"]
  ......
}