java - Hadoop MapReduce NoSuchElementException

Tags: java hadoop mapreduce hbase nosuchelementexception

I want to run a MapReduce job on a FreeBSD cluster with two nodes, but I am getting the following exception:

14/08/27 14:23:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/08/27 14:23:04 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
14/08/27 14:23:04 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
14/08/27 14:23:04 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
14/08/27 14:23:04 WARN mapreduce.JobSubmitter: No job jar file set.  User classes may not be found. See Job or Job#setJar(String).
14/08/27 14:23:04 INFO mapreduce.JobSubmitter: Cleaning up the staging area file:/tmp/hadoop-otlam/mapred/staging/otlam968414084/.staging/job_local968414084_0001
Exception in thread "main" java.util.NoSuchElementException
at java.util.StringTokenizer.nextToken(StringTokenizer.java:349)
at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:565)
at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:534)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.checkPermissionOfOther(ClientDistributedCacheManager.java:276)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.isPublic(ClientDistributedCacheManager.java:240)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineCacheVisibilities(ClientDistributedCacheManager.java:162)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:58)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
...

This happens when I try to run job.waitForCompletion(true); on a new MapReduce job. The NoSuchElementException is presumably thrown because next() is called on a StringTokenizer that has no more elements.
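
For illustration, here is a tiny standalone example (my own, not from the question) that reproduces the same failure mode: calling nextToken() on a tokenizer that has nothing left to return.

import java.util.StringTokenizer;

public class TokenizerDemo {
    public static void main(String[] args) {
        // an empty input yields zero tokens
        StringTokenizer t = new StringTokenizer("", " ");
        // throws java.util.NoSuchElementException, just like the trace above
        String first = t.nextToken();
    }
}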
I looked at the source code and found the following section in RawLocalFileSystem.java:
/// loads permissions, owner, and group from `ls -ld`
private void loadPermissionInfo() {
  IOException e = null;
  try {
    String output = FileUtil.execCommand(new File(getPath().toUri()), 
        Shell.getGetPermissionCommand());
    StringTokenizer t =
        new StringTokenizer(output, Shell.TOKEN_SEPARATOR_REGEX);
    //expected format
    //-rw-------    1 username groupname ...
    String permission = t.nextToken();
    // ... (rest of the method omitted)

As far as I can tell, Hadoop tries to determine the permissions of a particular file via ls -ld, which works fine when I run it in a console myself. Unfortunately, I have not yet found out which file's permissions it is looking for.
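
To see what Hadoop actually executes and then tokenizes, a small diagnostic along these lines may help (a sketch of mine, assuming Hadoop 2.4.1 on the classpath; the path to inspect is passed as the first command-line argument):

import java.util.Arrays;

import org.apache.hadoop.util.Shell;

public class PermissionCommandCheck {
    public static void main(String[] args) throws Exception {
        // the command Hadoop uses for permission lookups, e.g. "ls -ld" on Unix
        String[] cmd = Shell.getGetPermissionCommand();
        String[] full = Arrays.copyOf(cmd, cmd.length + 1);
        full[cmd.length] = args[0];
        System.out.println("command: " + Arrays.toString(full));
        // the raw output is exactly what loadPermissionInfo() tokenizes
        System.out.println("output: " + Shell.execCommand(full));
    }
}

If the output on FreeBSD deviates from the expected "-rw------- 1 username groupname ..." format, the tokenizer can run out of tokens and throw.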

The Hadoop version is 2.4.1 and the HBase version is 0.98.4, and I am using the Java API. Other operations, such as creating a table, work fine. Has anyone run into a similar problem, or knows what to do?

Edit:
I just found out that this is a Hadoop-related problem. Even the simplest MapReduce operation, without HDFS involved at all, gives me the same exception.
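
For reference, a job about as minimal as the one described (a sketch under the Hadoop 2.4.1 API, with placeholder local paths input/ and output/; it relies on the default identity Mapper and Reducer) already goes through the JobSubmitter code path from the stack trace:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MinimalJob {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "minimal");
        job.setJarByClass(MinimalJob.class);
        // TextInputFormat (the default) emits LongWritable/Text pairs,
        // which the identity Mapper and Reducer pass straight through
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path("input"));
        FileOutputFormat.setOutputPath(job, new Path("output"));
        // submitting this is enough to hit loadPermissionInfo()
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}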

Best Answer

Could you check whether the following solves your problem? If it is a permission issue, you can use the approach below:

import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.security.UserGroupInformation;

public static void main(String[] args) throws Exception {
    // set user group information
    UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hdfs");
    // run the filesystem/job code with that user's privileges
    ugi.doAs(new PrivilegedExceptionAction<Void>() {
        public Void run() throws Exception {
            // create configuration object
            Configuration config = new Configuration();
            config.set("fs.defaultFS", "hdfs://ip:port/");
            config.set("hadoop.job.ugi", "hdfs");
            FileSystem dfs = FileSystem.get(config);
            // ...
            return null;
        }
    });
}
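
Note that the Configuration and FileSystem are created inside run() on purpose: everything executed within doAs() runs under the credentials of the remote user created above, so the permission checks made during job submission see "hdfs" instead of the local user.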

Regarding java - Hadoop MapReduce NoSuchElementException, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/25364802/
