I'm new to Hadoop, and I'm trying this tutorial on Hive-HBase integration (the Hive INSERT query step): https://cwiki.apache.org/confluence/display/Hive/HBaseIntegration
1. Starting Hive with the following parameters succeeds: hive --auxpath /cygdrive/c/Hadoop/hive-0.9.0/lib/hive-hbase-handler-0.9.0.jar,/cygdrive/c/javaHBase/hbase-0.94.6/hbase-0.94.6.jar,/cygdrive/c/Hadoop/hive-0.9.0/lib/zookeeper-3.4.3.jar,/cygdrive/c/Hadoop/hive-0.9.0/lib/guava-r09.jar -hiveconf hbase.master=localhost:60010
2. Starting HBase succeeds.
3. "CREATE TABLE hbase_table_1" completes successfully.
4. I verified with the list and show tables commands; everything looks fine.
Here is my problem: "INSERT OVERWRITE TABLE hbase_table_1 SELECT * FROM pokes WHERE foo=98;"
I get this error message in the task log at "http://localhost:50060/tasklog ... attemptid ...":
java.lang.ClassNotFoundException: org/apache/hadoop/hive/hbase/HBaseSerDe
Continuing ...
java.lang.ClassNotFoundException: org/apache/hadoop/hive/hbase/HiveHBaseTableInputFormat
Continuing ...
java.lang.ClassNotFoundException: org/apache/hadoop/hive/hbase/HiveHBaseTableOutputFormat
Continuing ...
java.lang.NullPointerException
Continuing ...
java.lang.NullPointerException
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.initializeOp(FileSinkOperator.java:280)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:357)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:433)
at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:389)
at org.apache.hadoop.hive.ql.exec.SelectOperator.initializeOp(SelectOperator.java:62)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:357)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:433)
at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:389)
at org.apache.hadoop.hive.ql.exec.FilterOperator.initializeOp(FilterOperator.java:78)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:357)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:433)
at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:389)
...
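The ClassNotFound errors above are all for classes that live in hive-hbase-handler-0.9.0.jar, which suggests the --auxpath jars are not reaching the task JVMs. One sanity check I can run first is that every jar on the --auxpath list actually exists at the path given. This is a small helper of my own (check_jars is a hypothetical name, not a Hive tool):

```shell
# check_jars: my own debugging helper, not part of Hive.
# Takes a comma-separated jar list, as passed to --auxpath,
# and reports whether each file exists on disk.
check_jars() {
  echo "$1" | tr ',' '\n' | while read -r jar; do
    if [ -f "$jar" ]; then
      echo "OK      $jar"
    else
      echo "MISSING $jar"
    fi
  done
}
```

Running check_jars with the exact --auxpath string from step 1 would flag any typo in a path (a stray space after a comma, for example), which would silently break class loading in the tasks.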
I tried copying the Hive jars into the HBase installation, and vice versa... Note: I also added the required jar files from within the Hive CLI: ADD JAR C:\hive-hbase-handler-0.9.0.jar, etc.
HBase version: 0.94.6; Hive version: 0.9.0
Is there an extra export needed? Or some configuration? I need help! Thanks a lot!
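One export I am considering trying (an assumption on my part, not a confirmed fix): the Hive launcher also reads the HIVE_AUX_JARS_PATH environment variable, which plays the same role as --auxpath, so setting it before starting Hive would be a sketch like this (paths are the ones from step 1):

```shell
# Hedged sketch, not a verified fix: set Hive's auxiliary-jar env var
# instead of passing --auxpath on every launch. Paths are from step 1.
export HIVE_AUX_JARS_PATH=/cygdrive/c/Hadoop/hive-0.9.0/lib/hive-hbase-handler-0.9.0.jar,/cygdrive/c/javaHBase/hbase-0.94.6/hbase-0.94.6.jar,/cygdrive/c/Hadoop/hive-0.9.0/lib/zookeeper-3.4.3.jar,/cygdrive/c/Hadoop/hive-0.9.0/lib/guava-r09.jar
# List the entries one per line for a quick visual check:
echo "$HIVE_AUX_JARS_PATH" | tr ',' '\n'
```

After the export I would just run "hive" with no --auxpath and retry the INSERT.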
It works! Thanks a lot! – LMHadoop 2013-04-12 15:27:15