
Stack: HDP-2.3.2.0-2950 installed using Ambari 2.1. Permission exception with Sqoop.

The installation was automated, since the machines (9 nodes in total) have Internet connectivity, and it was carried out using root credentials.

Output of the ls command for reference (the sqoop user is missing):

[[email protected] ~]# hadoop fs -ls /user 
Found 7 items 
drwx------ - accumulo hdfs   0 2015-11-05 14:03 /user/accumulo 
drwxrwx--- - ambari-qa hdfs   0 2015-10-30 16:08 /user/ambari-qa 
drwxr-xr-x - hcat  hdfs   0 2015-10-30 16:17 /user/hcat 
drwxr-xr-x - hdfs  hdfs   0 2015-11-11 10:09 /user/hdfs 
drwx------ - hive  hdfs   0 2015-11-06 09:42 /user/hive 
drwxrwxr-x - oozie  hdfs   0 2015-11-05 12:53 /user/oozie 
drwxrwxr-x - spark  hdfs   0 2015-11-05 13:59 /user/spark 
[[email protected] ~]# 
[[email protected] ~]# 

Another worrying output (the sqoop user is missing here as well) when I look at the user groups:

cat /etc/group 
root:x:0: 
bin:x:1:bin,daemon 
daemon:x:2:bin,daemon 
sys:x:3:bin,adm 
adm:x:4:adm,daemon 
tty:x:5: 
disk:x:6: 
lp:x:7:daemon 
mem:x:8: 
kmem:x:9: 
wheel:x:10: 
mail:x:12:mail 
uucp:x:14: 
man:x:15: 
games:x:20: 
gopher:x:30: 
video:x:39: 
dip:x:40: 
ftp:x:50: 
lock:x:54: 
audio:x:63: 
nobody:x:99: 
users:x:100:oozie,ambari-qa,tez,falcon 
dbus:x:81: 
utmp:x:22: 
utempter:x:35: 
floppy:x:19: 
vcsa:x:69: 
cdrom:x:11: 
tape:x:33: 
dialout:x:18: 
haldaemon:x:68:haldaemon 
ntp:x:38: 
saslauth:x:76: 
mailnull:x:47: 
smmsp:x:51: 
stapusr:x:156: 
stapsys:x:157: 
stapdev:x:158: 
sshd:x:74: 
tcpdump:x:72: 
slocate:x:21: 
ovirtagent:x:175: 
rpc:x:32: 
rpcuser:x:29: 
nfsnobody:x:65534: 
munin:x:499: 
screen:x:84: 
scotty:x:999: 
tquest:x:6382: 
fuse:x:497: 
httpfs:x:496:httpfs 
knox:x:6383: 
spark:x:6384: 
hdfs:x:6385:hdfs 
accumulo:x:495: 
falcon:x:494: 
flume:x:493: 
hbase:x:492: 
hive:x:491: 
oozie:x:490: 
storm:x:489: 

While importing data from SQL Server into HDFS using Sqoop (as the 'sqoop' Linux user):

ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.security.AccessControlException: Permission denied: user=sqoop, access=WRITE, inode="/user/sqoop/.staging":hdfs:hdfs:drwxr-xr-x 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1755) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1738) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896) 
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622) 
     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616) 
     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2133) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2131) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
     at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
     at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106) 
     at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73) 
     at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3010) 
     at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2978) 
     at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1047) 
     at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1043) 
     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) 
     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1043) 
     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1036) 
     at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133) 
     at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:144) 
     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290) 
     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287) 
     at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308) 
     at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196) 
     at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169) 
     at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266) 
     at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673) 
     at org.apache.sqoop.manager.SQLServerManager.importTable(SQLServerManager.java:163) 
     at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497) 
     at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605) 
     at org.apache.sqoop.Sqoop.run(Sqoop.java:148) 
     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70) 
     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184) 
     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226) 
     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235) 
     at org.apache.sqoop.Sqoop.main(Sqoop.java:244) 
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=sqoop, access=WRITE, inode="/user/sqoop/.staging":hdfs:hdfs:drwxr-xr-x 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1755) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1738) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896) 
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622) 
     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616) 
     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2133) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2131) 
     at org.apache.hadoop.ipc.Client.call(Client.java:1427) 
     at org.apache.hadoop.ipc.Client.call(Client.java:1358) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) 
     at com.sun.proxy.$Proxy14.mkdirs(Unknown Source) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:558) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:606) 
     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) 
     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) 
     at com.sun.proxy.$Proxy15.mkdirs(Unknown Source) 
     at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3008) 
     ... 28 more 

While importing a table from SQL Server into HDFS using Sqoop (as the 'root' Linux user):

ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user/root/.staging":hdfs:hdfs:drwxr-xr-x 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1755) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1738) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896) 
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622) 
     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616) 
     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2133) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2131) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
     at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
     at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106) 
     at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73) 
     at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3010) 
     at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2978) 
     at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1047) 
     at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1043) 
     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) 
     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1043) 
     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1036) 
     at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133) 
     at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:144) 
     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290) 
     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287) 
     at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308) 
     at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196) 
     at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169) 
     at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266) 
     at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673) 
     at org.apache.sqoop.manager.SQLServerManager.importTable(SQLServerManager.java:163) 
     at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497) 
     at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605) 
     at org.apache.sqoop.Sqoop.run(Sqoop.java:148) 
     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70) 
     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184) 
     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226) 
     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235) 
     at org.apache.sqoop.Sqoop.main(Sqoop.java:244) 
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=root, access=WRITE, inode="/user/root/.staging":hdfs:hdfs:drwxr-xr-x 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1755) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1738) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896) 
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622) 
     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616) 
     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2133) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2131) 
     at org.apache.hadoop.ipc.Client.call(Client.java:1427) 
     at org.apache.hadoop.ipc.Client.call(Client.java:1358) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) 
     at com.sun.proxy.$Proxy14.mkdirs(Unknown Source) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:558) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:606) 
     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) 
     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) 
     at com.sun.proxy.$Proxy15.mkdirs(Unknown Source) 
     at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3008) 
     ... 28 more 

When I import the table from SQL Server into HDFS using Sqoop (as the 'hdfs' Linux user), it works, but there is an error statement in the log:

INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.2.0-2950 
16/05/04 16:34:13 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead. 
16/05/04 16:34:14 INFO manager.SqlManager: Using default fetchSize of 1000 
16/05/04 16:34:14 INFO tool.CodeGenTool: Beginning code generation 
16/05/04 16:34:14 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [DimSampleDesc] AS t WHERE 1=0 
16/05/04 16:34:15 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.3.2.0-2950/hadoop-mapreduce 
Note: /tmp/sqoop-hdfs/compile/6f239d67662b5e2a3462b51268033d6e/DimSampleDesc.java uses or overrides a deprecated API. 
Note: Recompile with -Xlint:deprecation for details. 
16/05/04 16:34:17 ERROR orm.CompilationManager: Could not make directory: /root/. 

I have the following questions:

  1. Why these errors in spite of an automated installation, i.e., I did not skip any service or configuration step?
  2. What is the proper way to run Sqoop imports or MR jobs (I mean, which user should be used for each)?

Answer


Ideally, you have to create a home directory (on HDFS) for the user who launches the command.

When you launch the sqoop command, Hadoop maps the local user to an HDFS user and tries to find its home directory, which is /user/${user.name}.
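
A quick way to check whether the calling user already has a home directory on HDFS (a minimal check, assuming the HDFS client is on the PATH):

$ hadoop fs -ls "/user/$(whoami)"   # errors out if the home directory does not exist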

Since it looks like the Hadoop superuser is hdfs, you need to do the following:

$ su - hdfs -c 'hadoop fs -mkdir /user/sqoop'
$ su - hdfs -c 'hadoop fs -chown sqoop:hdfs /user/sqoop'
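
The question shows the same failure for the root user, so if imports must also run as root, the same pattern would apply (a sketch under that assumption):

$ su - hdfs -c 'hadoop fs -mkdir -p /user/root'
$ su - hdfs -c 'hadoop fs -chown root:hdfs /user/root'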

Then launch sqoop as the user sqoop.
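
For illustration, a minimal sketch of such an import run as the sqoop user; the SQL Server host, database, credentials, target directory, and mapper count are placeholder assumptions, and only the table name DimSampleDesc comes from the log above:

$ su - sqoop
$ sqoop import \
    --connect 'jdbc:sqlserver://<sqlserver-host>:1433;databaseName=<database>' \
    --username <sql-user> -P \
    --table DimSampleDesc \
    --target-dir /user/sqoop/DimSampleDesc \
    --num-mappers 4

Here -P prompts for the password interactively, which also avoids the insecure-password warning seen in the log above.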

Another option is to change Hive's staging directory, in hive-site.xml, to some other HDFS location where all users have write access (such as /tmp):

<property> 
    <name>hive.exec.stagingdir</name> 
    <value>/tmp</value> 
</property> 

I edited my question, could you check it and help me understand? –


Yes! When you launch the command as the 'hdfs' user, hadoop (or hdfs) will execute it as hdfs, and since the 'hdfs' user already has its home directory on HDFS, the request goes through... But for the sqoop user, it tries to use the home /user/sqoop, cannot find it, tries to create it, and finds that sqoop has no write permission on /user (owned by the superuser 'hdfs'), so the error occurs... Hope this helps – user1314742


This worked. I now have a few doubts and one question, but I will post it as another question. –