2017-04-07

Sqoop export of local CSV to MySQL: MapReduce error

I am trying to export a local CSV file to the MySQL table "test":

$ sqoop export -fs local -jt local --connect jdbc:mysql://172.16.21.64:3306/cf_ae07c762_41a9_4b46_af6c_a29ecb050204 --username username --password password --table test --export-dir file:///home/username/test.csv 

However, I get a strange error saying that mapreduce.tar.gz was not found:

Warning: /usr/hdp/2.5.0.0-1245/hbase does not exist! HBase imports will fail. 
Please set $HBASE_HOME to the root of your HBase installation. 
Warning: /usr/hdp/2.5.0.0-1245/accumulo does not exist! Accumulo imports will fail. 
Please set $ACCUMULO_HOME to the root of your Accumulo installation. 
17/04/07 14:22:14 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245 
17/04/07 14:22:14 WARN fs.FileSystem: "local" is a deprecated filesystem name. Use "file:///" instead. 
17/04/07 14:22:14 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead. 
17/04/07 14:22:15 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset. 
17/04/07 14:22:15 INFO tool.CodeGenTool: Beginning code generation 
17/04/07 14:22:15 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test2` AS t LIMIT 1 
17/04/07 14:22:15 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test2` AS t LIMIT 1 
17/04/07 14:22:15 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.5.0.0-1245/hadoop-mapreduce 
Note: /tmp/sqoop-bedrock/compile/009603476b0dfc767b1b94c0607bf6fa/test2.java uses or overrides a deprecated API. 
Note: Recompile with -Xlint:deprecation for details. 
17/04/07 14:22:17 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-bedrock/compile/009603476b0dfc767b1b94c0607bf6fa/test2.jar 
17/04/07 14:22:17 INFO mapreduce.ExportJobBase: Beginning export of test2 
17/04/07 14:22:17 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId= 
17/04/07 14:22:17 ERROR tool.ExportTool: Encountered IOException running export job: java.io.FileNotFoundException: File file:/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz does not exist 

The file is, however, available on my local machine:

/usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz 

/data/hadoop/yarn/local/filecache/13/mapreduce.tar.gz 

Does anyone know what the problem is? I was simply following this guide:

http://ingest.tips/2015/02/06/use-sqoop-transfer-csv-data-local-filesystem-relational-database/


The 'export' command is fine; the problem is with the location '/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz'. You have to find from where Sqoop picks up this path, which is incorrect. – franklinsijo


Yes, that is the hard part, since I cannot figure out how to chase down that path variable. Where could the setting potentially be found? –
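(One quick way to chase that down, sketched under the assumption that the relevant Hadoop client configs live in *-site.xml files: grep them for the property name. The conf dir below is a throwaway demo stand-in; on a real HDP node you would point grep at /etc/hadoop/conf or $HADOOP_CONF_DIR instead.)

```shell
# Demo: find which Hadoop config file defines the framework path.
# A temp dir with a sample mapred-site.xml stands in for the real conf dir.
conf_dir=$(mktemp -d)
cat > "$conf_dir/mapred-site.xml" <<'EOF'
<configuration>
  <property>
    <name>mapreduce.application.framework.path</name>
    <value>/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz</value>
  </property>
</configuration>
EOF
# grep -l prints the name of each file that contains the property:
grep -l 'mapreduce.application.framework.path' "$conf_dir"/*.xml
```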


Think I found it. – franklinsijo

Answer


The property mapreduce.application.framework.path is set in mapred-site.xml with the value /hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz. This is the path of the MapReduce framework archive, and it points to a file in HDFS.
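For reference, the corresponding entry in mapred-site.xml would look roughly like this (a sketch based on the value quoted above, not copied from the asker's cluster):

```xml
<!-- mapred-site.xml: points MapReduce jobs at the framework archive in HDFS -->
<property>
  <name>mapreduce.application.framework.path</name>
  <value>/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz</value>
</property>
```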

Here, since Sqoop is triggered with -fs local, this property needs to be set to a LocalFS path. Try overriding the property value with the local path of the mapreduce archive file:

$ sqoop export -fs local -jt local -D 'mapreduce.application.framework.path=/usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz' --connect jdbc:mysql://172.16.21.64:3306/cf_ae07c762_41a9_4b46_af6c_a29ecb050204 --username username --password password --table test --export-dir file:///home/username/test.csv 

Thanks, that worked. But another follow-up question is here: http://stackoverflow.com/questions/43328725/sqoop-export-to-mysql-export-job-failed-tool-exporttool-but-got-records –