java - ArrayIndexOutOfBoundsException error

Tags: java, hadoop

When I run my code, I get an ArrayIndexOutOfBoundsException from the reducer task.

My code is as follows:

public void map(ImageHeader key, FloatImage value, Context context) throws IOException, InterruptedException {
    if (value != null) {
        mapcounter++;
        FloatImage gray = new FloatImage(value.getWidth(), value.getHeight(), value.getBands());

        int imageWidth = value.getWidth();
        int imageHeight = value.getHeight();

        for (int x = 0; x < imageWidth - 1; x++) {
            for (int y = 0; y < imageHeight - 1; y++) {
                float red = value.getPixel(x, y, 0);
                float green = value.getPixel(x, y, 1);
                float blue = value.getPixel(x, y, 2);
                // average of R, G and B
                float avg = (red + green + blue) / 3;

                // set R, G and B to the average value
                gray.setPixel(x, y, 0, avg);
                gray.setPixel(x, y, 1, avg);
                gray.setPixel(x, y, 2, avg);
            }
        }

        ImageEncoder encoder = JPEGImageUtil.getInstance();

        FSDataOutputStream os = fileSystem.create(outpath);
        encoder.encodeImage(gray, key, os);
        os.flush();
        os.close();

        context.write(new BooleanWritable(true), new LongWritable(1));
    } else {
        context.write(new BooleanWritable(false), new LongWritable(0));
    }
}
public static class MyReducer extends Reducer<BooleanWritable, LongWritable, BooleanWritable, LongWritable> {

    public void reduce(BooleanWritable key, Iterable<LongWritable> values, Context context)
            throws IOException, InterruptedException {
        System.out.println("REDUCING");
        for (LongWritable temp_hash : values) {
            context.write(new BooleanWritable(true), new LongWritable(1));
        }
    }
}
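For context, the grayscale step in the mapper is just a per-pixel average of the three channel values. A minimal, Hadoop-free sketch of that arithmetic (the class and method names here are illustrative, not part of the original program):

```java
// Sketch of the per-pixel grayscale averaging used in the mapper above.
// GrayscaleSketch and average() are illustrative names, not from the original code.
public class GrayscaleSketch {

    // Average the three channel values, as the mapper does for each pixel.
    static float average(float red, float green, float blue) {
        return (red + green + blue) / 3;
    }

    public static void main(String[] args) {
        // A pixel with R=30, G=60, B=90 averages to 60.
        System.out.println(average(30f, 60f, 90f));
    }
}
```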

The error is as follows:

...
12/12/30 09:06:01 INFO mapred.JobClient:  map 100% reduce 33%
12/12/30 09:06:03 INFO mapred.JobClient: Task Id : attempt_201212271308_0005_r_000000_0, Status : FAILED
java.lang.ArrayIndexOutOfBoundsException: 1
        at org.apache.hadoop.io.WritableComparator.readInt(WritableComparator.java:153)
        at org.apache.hadoop.io.BooleanWritable$Comparator.compare(BooleanWritable.java:103)
        at org.apache.hadoop.mapreduce.ReduceContext.nextKeyValue(ReduceContext.java:120)
        at org.apache.hadoop.mapreduce.ReduceContext.nextKey(ReduceContext.java:92)
        at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:175)
        at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:566)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:408)
        at org.apache.hadoop.mapred.Child.main(Child.java:170)

How can I fix this?

Second question: how can I skip the reduce phase in my program so that it never runs?

Best answer

This looks like a bug in Apache Hadoop. They claim to have fixed it as follows.

It was fixed as part of MAPREDUCE-365.

You can refer here for details about the bug.
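As for the second question, the standard way to avoid the reduce phase in Hadoop is to configure a map-only job by setting the number of reduce tasks to zero, in which case map output is written directly to the output path. A hedged sketch of the driver setup (the driver, mapper, and path names here are placeholders for your own classes; `setNumReduceTasks(0)` is the key call):

```java
// Sketch: configuring a map-only job so the reduce phase never runs.
// Driver, MyMapper and the path strings are illustrative placeholders.
Configuration conf = new Configuration();
Job job = Job.getInstance(conf, "map-only example");
job.setJarByClass(Driver.class);
job.setMapperClass(MyMapper.class);
job.setNumReduceTasks(0); // no reducer: mapper output goes straight to the output path
FileInputFormat.addInputPath(job, new Path(args[0]));
FileOutputFormat.setOutputPath(job, new Path(args[1]));
System.exit(job.waitForCompletion(true) ? 0 : 1);
```

With zero reduce tasks, the mapper's output key/value types become the job's final output types, so no reducer class or combiner is needed at all.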

Regarding this java - ArrayIndexOutOfBoundsException error, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/14088951/
