2017-08-11 48 views

I am trying to pull data from MapR DB into a dataframe in the pyspark shell, using a JDBC connection to Drill. (Establishing a connection to Drill using pyspark.)

This is what I run in my pyspark shell:

dataframe_mysql = sqlContext.read.format("jdbc") \
    .option("url", "jdbc:drill:zk=localhost:5181/drill/demo_mapr_com-drillbits;schema=dfs;") \
    .option("driver", "org.apache.drill.jdbc.Driver") \
    .option("dbtable", "select * from dfs.`/DDDE/jsondb/ruleengine/testtransactions`") \
    .option("user", "root") \
    .option("password", "mapr") \
    .load()

Unfortunately, I get the following error:

Traceback (most recent call last): 
    File "<stdin>", line 1, in <module> 
    File "/opt/mapr/spark/spark-1.6.3-bin-hadoop2.6/python/pyspark/sql/readwriter.py", line 139, in load 
    return self._df(self._jreader.load()) 
    File "/opt/mapr/spark/spark-1.6.3-bin-hadoop2.6/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 813, in __call__ 
    File "/opt/mapr/spark/spark-1.6.3-bin-hadoop2.6/python/pyspark/sql/utils.py", line 45, in deco 
    return f(*a, **kw) 
    File "/opt/mapr/spark/spark-1.6.3-bin-hadoop2.6/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value 
py4j.protocol.Py4JJavaError: An error occurred while calling o88.load. 
: java.lang.ClassNotFoundException: org.apache.drill.jdbc.Driver 
     at java.net.URLClassLoader$1.run(URLClassLoader.java:366) 
     at java.net.URLClassLoader$1.run(URLClassLoader.java:355) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at java.net.URLClassLoader.findClass(URLClassLoader.java:354) 
     at java.lang.ClassLoader.loadClass(ClassLoader.java:425) 
     at java.lang.ClassLoader.loadClass(ClassLoader.java:358) 
     at org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:38) 
     at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:45) 
     at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:45) 
     at scala.Option.foreach(Option.scala:236) 
     at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.createConnectionFactory(JdbcUtils.scala:45) 
     at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:120) 
     at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91) 
     at org.apache.spark.sql.execution.datasources.jdbc.DefaultSource.createRelation(DefaultSource.scala:57) 
     at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158) 
     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:606) 
     at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231) 
     at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381) 
     at py4j.Gateway.invoke(Gateway.java:259) 
     at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133) 
     at py4j.commands.CallCommand.execute(CallCommand.java:79) 
     at py4j.GatewayConnection.run(GatewayConnection.java:209) 
     at java.lang.Thread.run(Thread.java:745)

Any idea where I went wrong?

EDIT: From sqlline I am able to retrieve the data as follows:

!connect jdbc:drill:zk=localhost:31010/drill/demo_mapr_com-drillbits;schema=dfs; 
!connect jdbc:drill:zk=localhost:5181/drill/demo_mapr_com-drillbits;schema=dfs; 
select * from dfs.`/DDDE/jsondb/ruleengine/testtransactions`; 

Is the Drill JDBC driver on your cluster? – eliasah


@eliasah I am not sure how to check that. Also, please take a look at the edit. – AYa

Answer


The Drill JDBC driver JAR file must be available on the client machine so that you can configure the driver for the application or third-party tool you intend to use. You can pass the driver as follows:

Copy the drill-jdbc-all JAR file from the Drill installation directory into your working directory, then launch your job with the following script:

./bin/spark-submit --jars drill-jdbc-all-<version>.jar your_spark_script.py 

If you are using the pyspark shell, you should do the following:

pyspark --jars drill-jdbc-all-<version>.jar 

If you cannot find the JAR, it is in the Drill installation directory:

$> tree jars/jdbc-driver/ 
jars/jdbc-driver/ 
└── drill-jdbc-all-1.10.0.jar # this is it 
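Once the shell is started with the driver jar on the classpath, the read from the question should work. Below is a minimal sketch of the options, reusing the ZooKeeper address, credentials, and table path from the question (adjust them for any other cluster). One additional point worth hedging: Spark's `dbtable` option expects a table name or a parenthesized subquery with an alias, not a bare SELECT statement, so the query from the question likely needs to be wrapped as shown.

```python
# Sketch of the JDBC read options. The ZooKeeper address, credentials, and
# table path are the ones from the question; treat them as placeholders.
url = "jdbc:drill:zk=localhost:5181/drill/demo_mapr_com-drillbits;schema=dfs"

# Spark's "dbtable" option is substituted into "SELECT ... FROM <dbtable>",
# so a raw query must be wrapped in parentheses and given an alias.
dbtable = "(select * from dfs.`/DDDE/jsondb/ruleengine/testtransactions`) AS t"

options = {
    "url": url,
    "driver": "org.apache.drill.jdbc.Driver",
    "dbtable": dbtable,
    "user": "root",
    "password": "mapr",
}

# Inside a pyspark shell launched with --jars drill-jdbc-all-<version>.jar:
# df = sqlContext.read.format("jdbc").options(**options).load()
```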

As of now I run the commands in the shell, not in a python script. – AYa


Thanks. I do have a lot of jars from the Drill installation in 'drill-1.10.0/jars'. I did not find a 'drill-jdbc-all' jar, but I do have 'drill-jdbc-1.10.0.jar'. Is that the one you want me to copy? – AYa


@AYa I have updated my answer again – eliasah