Is it possible to join data from MySQL and Oracle databases in Spark SQL? I tried to join them, but I ran into some trouble setting multiple jars (the JDBC drivers for MySQL and Oracle) in SPARK_CLASSPATH. Here is my code:
import os
import sys

os.environ['SPARK_HOME'] = "/home/x/spark-1.5.2"
sys.path.append("/home/x/spark-1.5.2/python/")

try:
    from pyspark import SparkContext, SparkConf
    from pyspark.sql import SQLContext

    MYSQL_DRIVER_PATH = "/home/x/spark-1.5.2/python/lib/mysql-connector-java-5.1.38-bin.jar"
    MYSQL_CONNECTION_URL = "jdbc:mysql://192.111.333.999:3306/db?user=us&password=pasw"
    ORACLE_DRIVER_PATH = "/home/x/spark-1.5.2/python/lib/ojdbc6.jar"
    Oracle_CONNECTION_URL = "jdbc:oracle:thin:user/[email protected]:1521:xe"

    # Define Spark configuration
    conf = SparkConf()
    conf.setMaster("local")
    conf.setAppName("MySQL_Oracle_imp_exp")

    # Initialize a SparkContext and SQLContext
    sc = SparkContext(conf=conf)
    #sc.addJar(MYSQL_DRIVER_PATH)
    sqlContext = SQLContext(sc)

    ora_tmp = sqlContext.read.format('jdbc').options(
        url=Oracle_CONNECTION_URL,
        dbtable="TABLE1",
        driver="oracle.jdbc.OracleDriver"
    ).load()
    ora_tmp.show()

    tmp2 = sqlContext.load(
        source="jdbc",
        path=MYSQL_DRIVER_PATH,
        url=MYSQL_CONNECTION_URL,
        dbtable="(select city,zip from TABLE2 limit 10) as tmp2",
        driver="com.mysql.jdbc.Driver")
    c_rows = tmp2.collect()
    ....
except Exception as e:
    print e
    sys.exit(1)
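For reference, the multiple-jar part of this is usually handled at launch time rather than inside the script: SPARK_CLASSPATH takes colon-separated entries, while spark-submit's --jars flag takes a comma-separated list (and --driver-class-path again uses colons). A minimal sketch using the two jar paths from the code above; the script name my_join_script.py is a placeholder, not something from the question:

```shell
# Option 1: colon-separated jars in SPARK_CLASSPATH
# (deprecated in Spark 1.x, but matches the approach described above)
export SPARK_CLASSPATH="/home/x/spark-1.5.2/python/lib/mysql-connector-java-5.1.38-bin.jar:/home/x/spark-1.5.2/python/lib/ojdbc6.jar"

# Option 2: pass both driver jars to spark-submit
# --jars is comma-separated; --driver-class-path is colon-separated
spark-submit \
  --jars /home/x/spark-1.5.2/python/lib/mysql-connector-java-5.1.38-bin.jar,/home/x/spark-1.5.2/python/lib/ojdbc6.jar \
  --driver-class-path /home/x/spark-1.5.2/python/lib/mysql-connector-java-5.1.38-bin.jar:/home/x/spark-1.5.2/python/lib/ojdbc6.jar \
  my_join_script.py
```

Once both DataFrames load, they can be joined like any other DataFrames, e.g. ora_tmp.join(tmp2, <join condition>).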
Could someone please help me solve this problem? Thanks in advance :)