I want to execute a Pig statement to display the data from my txt file. I am running in mapreduce mode, but I am getting an error. Can someone please help me fix this?! The error when trying to execute the Pig statement:
[[email protected] ~]# pig -x mapreduce
17/04/19 17:42:34 INFO pig.ExecTypeProvider: Trying ExecType : LOCAL
17/04/19 17:42:34 INFO pig.ExecTypeProvider: Trying ExecType : MAPREDUCE
17/04/19 17:42:34 INFO pig.ExecTypeProvider: Picked MAPREDUCE as the ExecType
2017-04-19 17:42:34,853 [main] INFO org.apache.pig.Main - Apache Pig version 0.16.0 (r1746530) compiled Jun 01 2016, 23:10:49
2017-04-19 17:42:34,853 [main] INFO org.apache.pig.Main - Logging error messages to: /root/pig_1492603954851.log
2017-04-19 17:42:34,907 [main] INFO org.apache.pig.impl.util.Utils - Default bootup file /root/.pigbootup not found
2017-04-19 17:42:36,060 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://localhost
2017-04-19 17:42:37,130 [main] INFO org.apache.pig.PigServer - Pig Script ID for the session: PIG-default-f60d05c3-9fee-4624-9aa8-07f1584e6165
2017-04-19 17:42:37,130 [main] WARN org.apache.pig.PigServer - ATS is disabled since yarn.timeline-service.enabled set to false
grunt> dump b;
2017-04-19 17:42:41,135 [main] ERROR org.apache.pig.tools.grunt.Grunt - You don't have permission to perform the operation. Error from the server: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=EXECUTE, inode="/tmp/temp1549818457":dead:supergroup:drwx------
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1720)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1704)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkTraverse(FSDirectory.java:1692)
at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:60)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3894)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:983)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
2017-04-19 17:42:41,136 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias b
Details at logfile: /root/pig_1492603954851.log
You can check: http://stackoverflow.com/questions/7194069/apache-pig-permissions-issue
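The exception in the question points at exactly that kind of problem: the Pig scratch directory /tmp/temp1549818457 is owned by user "dead" (group supergroup) with mode drwx------, while the Grunt shell is running as root, so root cannot even traverse it. A minimal sketch of how to confirm and work around this, assuming an HDFS client is on the PATH and the cluster uses simple (non-Kerberos) authentication:

hdfs dfs -ls /tmp                          # show the owner and mode of the Pig temp directories
HADOOP_USER_NAME=dead pig -x mapreduce     # re-run Grunt as the user that owns the temp directory

Alternatively, as the HDFS superuser, /tmp can be opened up with the sticky bit, which is the usual layout for a shared scratch area: hdfs dfs -chmod 1777 /tmp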
When I changed the /tmp directory permissions to allow everyone, it gives me these errors:
Input(s): Failed to read data from "/temp"
Output(s): Failed to produce result in "hdfs://localhost/tmp/temp1691370991/tmp-1112412323"
Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0
Job DAG: null
org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias b
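That follow-up error is usually no longer a permissions problem on the temp directory: "Failed to read data from "/temp"" means the path given to LOAD for alias b does not exist in HDFS (or is not readable by the user running the job). A quick sanity check, using /temp/data.txt purely as a hypothetical path because the original LOAD statement is not shown in the question:

hdfs dfs -ls /temp                         # confirm the input directory and file actually exist in HDFS
hdfs dfs -put data.txt /temp/              # hypothetical: upload the local txt file if it is missing

and then, back in Grunt:

b = LOAD '/temp/data.txt';                 -- hypothetical path; defaults to tab-delimited PigStorage
dump b;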
Check whether you have the correct access permissions to read the file from that folder. If not, grant access to the HDFS folder as well.
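If the input data has to stay owned by another user, granting read access (plus execute on directories) is enough for LOAD to work. A sketch of two common options, assuming the input lives under /temp and that you have the rights to change it (you are the owner or the HDFS superuser):

hdfs dfs -chmod -R o+rx /temp                 # let everyone read and traverse the input folder

or, more selectively, an HDFS ACL (this requires dfs.namenode.acls.enabled=true on the NameNode):

hdfs dfs -setfacl -R -m user:root:r-x /temp   # grant only the root user read/traverse access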