
I am trying to import a database from MySQL into Hive (on Hadoop), using Sqoop's --hive-import option to create the table automatically and load the data, but the Sqoop --hive-import command fails.

I am using the command shown further below; when I execute it, the following error occurs:

11/08/11 23:02:49 INFO hive.HiveImport: Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.security.UserGroupInformation.login(Lorg/apache/hadoop/conf/Configuration;)Lorg/apache/hadoop/security/UserGroupInformation; 
11/08/11 23:02:49 INFO hive.HiveImport:  at org.apache.hadoop.hive.shims.Hadoop20Shims.getUGIForConf(Hadoop20Shims.java:448) 
11/08/11 23:02:49 INFO hive.HiveImport:  at org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator.setConf(HadoopDefaultAuthenticator.java:51) 
11/08/11 23:02:49 INFO hive.HiveImport:  at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62) 
11/08/11 23:02:49 INFO hive.HiveImport:  at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117) 
11/08/11 23:02:49 INFO hive.HiveImport:  at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthenticator(HiveUtils.java:222) 
11/08/11 23:02:49 INFO hive.HiveImport:  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:241) 
11/08/11 23:02:49 INFO hive.HiveImport:  at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:463) 
11/08/11 23:02:49 INFO hive.HiveImport:  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
11/08/11 23:02:49 INFO hive.HiveImport:  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
11/08/11 23:02:49 INFO hive.HiveImport:  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
11/08/11 23:02:49 INFO hive.HiveImport:  at java.lang.reflect.Method.invoke(Method.java:616) 
11/08/11 23:02:49 INFO hive.HiveImport:  at org.apache.hadoop.util.RunJar.main(RunJar.java:186) 
11/08/11 23:02:49 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 1 
    at com.cloudera.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:326) 
    at com.cloudera.sqoop.hive.HiveImport.executeScript(HiveImport.java:276) 
    at com.cloudera.sqoop.hive.HiveImport.importTable(HiveImport.java:218) 
    at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:362) 
    at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:423) 
    at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144) 
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65) 
    at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180) 
    at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219) 
    at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228) 
    at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237) 

Is there anything wrong with my Sqoop command? I am running the import with:

./sqoop import --connect jdbc:mysql://localhost/testhive --table temenan -m 1 --hive-import --username anwar -P 

Or is some other configuration needed in Sqoop or Hive?

Please help me.

Answers

I downgraded from CDH3.1 to CDH3.0, and then I changed the Hive metastore configuration as follows:

<property> 
    <name>javax.jdo.option.ConnectionURL</name> 
    <value>jdbc:derby:;databaseName=/home/hadoop/metastore_db;create=true</value> 
    <description>JDBC connect string for a JDBC metastore</description> 
</property> 

Then everything worked fine :)

Thanks all.
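As a side note, the javax.jdo.option.ConnectionURL property above normally goes into Hive's configuration file, typically $HIVE_HOME/conf/hive-site.xml (the databaseName path shown is just an example). A minimal sanity check after making the change, assuming the hive client is on the PATH, is to run a trivial statement from the command line:

hive -e "SHOW TABLES;"

If Hive starts and lists tables without throwing the NoSuchMethodError shown above, the metastore setting and the Hadoop/Hive versions are consistent.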


You have to add the --hive-table argument and give it the Hive table name. The following command will automatically create a Hive table named temenan_hive:

./sqoop import --connect jdbc:mysql://localhost/testhive --table temenan -m 1 --username anwar -P --hive-import --hive-table temenan_hive
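If the import finishes without errors, a quick way to confirm that the table was created and populated (assuming it lands in Hive's default database) is:

hive -e "SELECT * FROM temenan_hive LIMIT 5;"

This should print the first few rows imported from the MySQL table temenan.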