hadoop - AccessControlException in YARN MapReduce jobs

Tags: hadoop mapreduce cluster-computing yarn cloudera-cdh

We recently upgraded to CDH 5.1.3 with YARN, and our MapReduce jobs are failing with the following error:

at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
        at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1140)
        at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1128)
        at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1118)
        at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:264)
        at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:231)
        at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:224)
        at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1291)
        at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:300)
        at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:296)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:296)
        at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:764)
        at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.<init>(JobHistoryParser.java:86)
        at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.loadFullHistoryData(CompletedJob.java:335)
        ... 22 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=mapred, access=READ, inode="/user/history/done_intermediate/abc/job_1412716537481_0426-1412782860181-abc-PigLatin%3ACategory+lift+for+pixels%3A9259-1412782882528-1-1-SUCCEEDED-root.abc-1412782867082.jhist":abc:supergroup:-rwxrwx---
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:271)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:257)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:185)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5607)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5589)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:5551)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsUpdateTimes(FSNamesystem.java:1717)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1669)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1649)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1621)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:482)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:322)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1026)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1986)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1982)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformatio

Is there any way to resolve this issue?

What permissions should we have on /user/* and /user/history/*?

Best Answer

The issue was resolved after deleting /user/history/done_intermediate/*. As the trace shows, the .jhist files there were owned by abc:supergroup with mode -rwxrwx---, so the mapred user (which runs the JobHistoryServer) had no read access to them; removing the stale files cleared the error.
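Rather than deleting the files every time the error recurs, a longer-term fix is to align the ownership and permissions of the history directories with what the JobHistoryServer expects. The commands below are a sketch only: the mapred:hadoop ownership and the 1777 mode on the intermediate-done directory follow common CDH defaults for mapreduce.jobhistory.intermediate-done-dir, and both the user/group names and paths may differ on your cluster.

```shell
# Inspect the offending files; in the trace above they were abc:supergroup, mode 770.
sudo -u hdfs hdfs dfs -ls /user/history/done_intermediate

# Hand the history tree to the JobHistoryServer user (assumed mapred:hadoop here).
sudo -u hdfs hdfs dfs -chown -R mapred:hadoop /user/history

# A sticky, world-writable intermediate-done dir lets each job user write its
# own history files while mapred can still read and move them to the done dir.
sudo -u hdfs hdfs dfs -chmod 1777 /user/history/done_intermediate
```

With these permissions in place, newly written .jhist files remain readable by the mapred user, so the AccessControlException should not reappear for subsequent jobs.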

A similar question about this AccessControlException in YARN MapReduce jobs can be found on Stack Overflow: https://stackoverflow.com/questions/26282690/
