
Running:

[cloudera@quickstart ~]$ sqoop export --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" --username retail_dba --password cloudera --table department_export --export-dir /home/cloudera/sqoop_import/departments -m 12

The export fails saying additional blocks are needed in HDFS.
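For readability, the same command with one option per line (a sketch; the --export-dir value is the path from the question as written and should match wherever the departments data actually sits in HDFS):

# export HDFS data into an existing MySQL table on the quickstart VM
sqoop export \
    --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
    --username retail_dba \
    --password cloudera \
    --table department_export \
    --export-dir /home/cloudera/sqoop_import/departments \
    -m 12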

Error:

16/12/24 22:29:48 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
16/12/24 22:29:49 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/cloudera/.staging/job_1482646432089_0001
16/12/24 22:29:49 WARN security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot delete /tmp/hadoop-yarn/staging/cloudera/.staging/job_1482646432089_0001. Name node is in safe mode.
The reported blocks 1268 needs additional 39 blocks to reach the threshold 0.9990 of total blocks 1308. The number of live datanodes 1 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1446)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:4072)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:4030)
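The numbers in the message line up: the threshold is 0.9990 of 1308 total blocks, i.e. about 1306.7, so 1307 blocks must be reported before safe mode lifts, and 1307 - 1268 = 39 more blocks are needed. To see which blocks are missing or under-reported, the standard HDFS commands can be run inside the quickstart VM:

# summary of datanodes, capacity, and missing/under-replicated blocks
hdfs dfsadmin -report

# walk the namespace and list files that have missing or corrupt blocks
hdfs fsck / -list-corruptfileblocks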

Tried "hdfs dfsadmin -safemode leave", and got an error again:

16/12/24 10:37:59 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
16/12/24 10:38:00 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/cloudera/.staging/job_1482602419946_0007
16/12/24 10:38:00 WARN security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot delete /tmp/hadoop-yarn/staging/cloudera/.staging/job_1482602419946_0007. Name node is in safe mode.
It was turned on manually. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.
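Note that this second log reports a different state from the first: safe mode is now "turned on manually", which means someone ran hdfs dfsadmin -safemode enter at some point; in that state it will not exit automatically and has to be turned off explicitly. A plausible sequence to check and clear it on the quickstart VM is sketched below; running dfsadmin as the hdfs superuser is an assumption about this particular setup:

# check whether the namenode is currently in safe mode
hdfs dfsadmin -safemode get

# leave safe mode; dfsadmin may need to run as the hdfs superuser rather than cloudera
sudo -u hdfs hdfs dfsadmin -safemode leave

# confirm it is off before re-running the sqoop export
hdfs dfsadmin -safemode get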

Answer


Make sure you have set the HCAT_HOME environment variable correctly for the Sqoop runtime. The error you are getting is because Sqoop cannot find the required dependency "org.apache.hive.hcatalog*", which is available in the HCatalog (hcat) libraries.
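As a sketch of what that looks like on a CDH quickstart VM (the /usr/lib/hive-hcatalog path is an assumption about this layout; verify the actual install location first):

# assumed HCatalog install directory on the quickstart VM; adjust if yours differs
export HCAT_HOME=/usr/lib/hive-hcatalog

# verify the variable is set before re-running the sqoop export
echo $HCAT_HOME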