
When I run a Sqoop import, it exits with: ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 9

./sqoop-import --connect jdbc:mysql://localhost/sqoop2 -table sqeep2 -m 1 -hive-import 

When I execute this command, I get:

[email protected]:/opt/sqoop/bin$ ./sqoop-import --connect jdbc:mysql://localhost/sqoop2 -table sqeep2 -m 1 -hive-import 
12/06/20 10:00:44 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override 
12/06/20 10:00:44 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc. 
12/06/20 10:00:44 INFO tool.CodeGenTool: Beginning code generation 
12/06/20 10:00:45 INFO manager.MySQLManager: Executing SQL statement: SELECT t.* FROM `sqeep2` AS t LIMIT 1 
12/06/20 10:00:45 INFO manager.MySQLManager: Executing SQL statement: SELECT t.* FROM `sqeep2` AS t LIMIT 1 
12/06/20 10:00:45 INFO orm.CompilationManager: HADOOP_HOME is /opt/hadoop 
Note: /tmp/sqoop-hadoop/compile/dedd7d201dfca40c5cd5dee4919e0487/sqeep2.java uses or overrides a deprecated API. 
Note: Recompile with -Xlint:deprecation for details. 
12/06/20 10:00:46 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/dedd7d201dfca40c5cd5dee4919e0487/sqeep2.jar 
12/06/20 10:00:46 WARN manager.MySQLManager: It looks like you are importing from mysql. 
12/06/20 10:00:46 WARN manager.MySQLManager: This transfer can be faster! Use the --direct 
12/06/20 10:00:46 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path. 
12/06/20 10:00:46 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql) 
12/06/20 10:00:46 INFO mapreduce.ImportJobBase: Beginning import of sqeep2 
12/06/20 10:00:46 INFO manager.MySQLManager: Executing SQL statement: SELECT t.* FROM `sqeep2` AS t LIMIT 1 
12/06/20 10:00:46 INFO util.NativeCodeLoader: Loaded the native-hadoop library 
12/06/20 10:00:46 INFO mapred.JobClient: Running job: job_201206200849_0006 
12/06/20 10:00:47 INFO mapred.JobClient: map 0% reduce 0% 
12/06/20 10:00:52 INFO mapred.JobClient: map 100% reduce 0% 
12/06/20 10:00:53 INFO mapred.JobClient: Job complete: job_201206200849_0006 
12/06/20 10:00:53 INFO mapred.JobClient: Counters: 11 
12/06/20 10:00:53 INFO mapred.JobClient: Job Counters 
12/06/20 10:00:53 INFO mapred.JobClient:  SLOTS_MILLIS_MAPS=4255 
12/06/20 10:00:53 INFO mapred.JobClient:  Total time spent by all reduces waiting after reserving slots (ms)=0 
12/06/20 10:00:53 INFO mapred.JobClient:  Total time spent by all maps waiting after reserving slots (ms)=0 
12/06/20 10:00:53 INFO mapred.JobClient:  Launched map tasks=1 
12/06/20 10:00:53 INFO mapred.JobClient:  SLOTS_MILLIS_REDUCES=0 
12/06/20 10:00:53 INFO mapred.JobClient: FileSystemCounters 
12/06/20 10:00:53 INFO mapred.JobClient:  FILE_BYTES_READ=106 
12/06/20 10:00:53 INFO mapred.JobClient:  FILE_BYTES_WRITTEN=41020 
12/06/20 10:00:53 INFO mapred.JobClient: Map-Reduce Framework 
12/06/20 10:00:53 INFO mapred.JobClient:  Map input records=3 
12/06/20 10:00:53 INFO mapred.JobClient:  Spilled Records=0 
12/06/20 10:00:53 INFO mapred.JobClient:  Map output records=3 
12/06/20 10:00:53 INFO mapred.JobClient:  SPLIT_RAW_BYTES=87 
12/06/20 10:00:53 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 6.5932 seconds (0 bytes/sec) 
12/06/20 10:00:53 INFO mapreduce.ImportJobBase: Retrieved 3 records. 
12/06/20 10:00:53 INFO hive.HiveImport: Loading uploaded data into Hive 
12/06/20 10:00:53 INFO manager.MySQLManager: Executing SQL statement: SELECT t.* FROM `sqeep2` AS t LIMIT 1 
12/06/20 10:00:53 INFO manager.MySQLManager: Executing SQL statement: SELECT t.* FROM `sqeep2` AS t LIMIT 1 
12/06/20 10:00:53 WARN hive.TableDefWriter: Column price had to be cast to a less precise type in Hive 
12/06/20 10:00:53 WARN hive.TableDefWriter: Column design_date had to be cast to a less precise type in Hive 
12/06/20 10:00:54 INFO hive.HiveImport: Hive history file=/tmp/hadoop/hive_job_log_hadoop_201206201000_695261712.txt 
12/06/20 10:01:00 INFO hive.HiveImport: FAILED: Error in metadata: MetaException(message:Got exception: java.io.FileNotFoundException File file:/user/hive/warehouse/sqeep2 does not exist.) 
12/06/20 10:01:00 INFO hive.HiveImport: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask 
12/06/20 10:01:00 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 9 
    at com.cloudera.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:326) 
    at com.cloudera.sqoop.hive.HiveImport.executeScript(HiveImport.java:276) 
    at com.cloudera.sqoop.hive.HiveImport.importTable(HiveImport.java:218) 
    at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:362) 
    at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:423) 
    at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144) 
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65) 
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79) 
    at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180) 
    at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:218) 
    at com.cloudera.sqoop.Sqoop.main(Sqoop.java:228) 

[email protected]:/opt/sqoop/bin$ 

What is wrong with my sqoop command?

Please, any solution :) Thanks

Answers

0

Does the table sqeep2 exist?

12/06/20 10:01:00 INFO hive.HiveImport: FAILED: Error in metadata: MetaException(message:Got exception: java.io.FileNotFoundException File file:/user/hive/warehouse/sqeep2 does not exist.)
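
For reference, a quick way to check both things (a sketch; the database name and the paths come from the question and the error message, and the MySQL user is a placeholder):

    # does the source table exist in the sqoop2 database? (adjust the MySQL user as needed)
    mysql -u root -p -e "SHOW TABLES LIKE 'sqeep2';" sqoop2

    # the error uses the file: scheme, so Hive is looking on the local filesystem -- does that path exist?
    ls -ld /user/hive/warehouse/sqeep2

    # what the map-reduce phase wrote to HDFS (by default Sqoop imports into <table> under your HDFS home directory)
    hadoop fs -ls sqeep2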

+0

Does the /user/hive/warehouse directory also affect the Sqoop import of the table? Thanks :) –

+0

Not sure I understand your question –

+0

Sorry, in the case above I actually have not created a warehouse directory yet. Does that also have an effect? –

2

I modified hive-site.xml so that the value of hive.metastore.warehouse.dir is /home/hadoop/data/hive/warehouse.

Then it worked.
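
For reference, the relevant hive-site.xml entry looks something like this (a sketch showing only the one property, which goes inside the usual <configuration> root; the path is the one from this answer):

    <property>
      <name>hive.metastore.warehouse.dir</name>
      <value>/home/hadoop/data/hive/warehouse</value>
    </property>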

0

Your Sqoop syntax is wrong: tool-specific arguments take two dashes. Try:

./sqoop import --connect jdbc:mysql://localhost/sqoop2 --table sqeep2 -m 1 --hive-import

0

Step 1: Set $HADOOP_HOME, $HIVE_HOME and $SQOOP_HOME in your .bashrc file.
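
For example (a sketch: /opt/hadoop and /opt/sqoop are taken from the log above, while the Hive path is an assumption you should adjust to your install):

    # example ~/.bashrc entries
    export HADOOP_HOME=/opt/hadoop
    export HIVE_HOME=/opt/hive        # assumption -- point this at your actual Hive install
    export SQOOP_HOME=/opt/sqoop
    export PATH=$PATH:$HADOOP_HOME/bin:$HIVE_HOME/bin:$SQOOP_HOME/bin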

Step 2: Start the hiveserver and metastore services:

    bin/hive --service hiveserver
    bin/hive --service metastore

Then try your command again.