2016-02-20

Sqoop export of a CSV file to MySQL fails. I have a CSV file in HDFS whose lines look like this:

"2015-12-01","Augusta","46728.0","1" 

I want to export this file into the following MySQL table:

CREATE TABLE test.events_top10(
    dt VARCHAR(255), 
    name VARCHAR(255), 
    summary VARCHAR(255), 
    row_number VARCHAR(255) 
); 

using this command:

sqoop export --table events_top10 --export-dir /user/hive/warehouse/result --escaped-by \" --connect ... 

The command fails with this error:

Error: java.io.IOException: Can't export data, please check failed map task logs 
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112) 
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39) 
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) 
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64) 
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) 
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:415) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671) 
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 
Caused by: java.lang.RuntimeException: Can't parse input data: '2015-12-02,Ashburn,43040.0,9' 
    at events_top10.__loadFromFields(events_top10.java:335) 
    at events_top10.parse(events_top10.java:268) 
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83) 
    ... 10 more 
Caused by: java.util.NoSuchElementException 
    at java.util.ArrayList$Itr.next(ArrayList.java:834) 
    at events_top10.__loadFromFields(events_top10.java:320) 
    ... 12 more 

If I omit the --escaped-by \" parameter, the MySQL table ends up containing rows like this:

"2015-12-01" | "Augusta"  | "46728.0" | "1" 

Can you explain how to export this CSV file into the MySQL table without the double quotes?
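For illustration only (this is not Sqoop's actual generated parser), the failure mode can be sketched in a few lines of Python: a naive comma split keeps the double quotes attached to every value, while a parser that honours an enclosure character, which is what --enclosed-by '\"' tells Sqoop to do, yields clean fields:

```python
import csv
import io

line = '"2015-12-01","Augusta","46728.0","1"'

# Naive split: the quotes stay attached to every value,
# so they end up stored in the table (or the parse fails).
naive = line.split(",")
print(naive)    # ['"2015-12-01"', '"Augusta"', '"46728.0"', '"1"']

# Honouring the enclosure character strips the quotes
# and yields the clean field values.
parsed = next(csv.reader(io.StringIO(line), quotechar='"'))
print(parsed)   # ['2015-12-01', 'Augusta', '46728.0', '1']
```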

Answer

I had to use both --escaped-by '\\' and --enclosed-by '\"'. The correct command is:

sqoop export --table events_top10 --export-dir /user/hive/warehouse/result --escaped-by '\\' --enclosed-by '\"' --connect ... 

For more information, see the official documentation.
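As a rough analogy (again, not Sqoop's generated code), the --escaped-by '\\' flag matters when a field value itself contains the enclosure character. Python's csv module can mimic that combination of enclosure plus backslash escape:

```python
import csv
import io

# A hypothetical record whose second field contains embedded quotes,
# escaped with a backslash -- the convention --escaped-by '\\' describes.
line = '"2015-12-01","Augusta \\"GA\\"","46728.0","1"'

# quotechar mirrors --enclosed-by, escapechar mirrors --escaped-by.
parsed = next(csv.reader(io.StringIO(line), quotechar='"',
                         escapechar='\\', doublequote=False))
print(parsed)   # ['2015-12-01', 'Augusta "GA"', '46728.0', '1']
```

Without the escape character declared, the embedded quotes would prematurely terminate the field and the record would mis-parse, which is why both flags are needed together.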