Here is simple code from the official documentation for connecting to HBase 1.2 from Spark 2.1:
import java.util.Arrays;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.spark.JavaHBaseContext;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class MyApp {
    public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("MyApp")
                .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
        JavaSparkContext sc = new JavaSparkContext(conf);
        Configuration cfg = HBaseConfiguration.create();
        // cfg.set("hbase.zookeeper.quorum", "localhost");
        JavaHBaseContext hc = new JavaHBaseContext(sc, cfg);
        JavaRDD<String> rdd = sc.parallelize(Arrays.asList("Tom", "Jerry"));
        System.out.println(rdd.collect());
    }
}
and this dependency in the Maven POM:
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-spark</artifactId>
    <version>2.0.0-alpha-1</version>
</dependency>
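(The POM excerpt above only shows the hbase-spark artifact; the project also declares the Spark dependencies themselves, since the code uses SparkConf and JavaSparkContext directly. As a rough sketch of what that part looks like, assuming Spark 2.1.0 built for Scala 2.11; the exact coordinates in my project may differ:)

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
</dependency>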
When I run it, I get this error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging
How can I fix this?