Hadoop custom output RecordWriter error

Tags: hadoop, video, mapreduce, xuggler

I am using HVPI, an open-source Hadoop video-processing interface, to process video with Hadoop and MapReduce (fully distributed mode). I split the video into frames, and I want to use those frames to build a new video with the Xuggler API.

The map phase completes normally, but the reduce phase fails with java.lang.RuntimeException: error Operation not allowed.
This happens because I am trying to create the new video in a local directory on the master node, and I don't really know how to write it to HDFS instead.

17/03/25 08:07:12 INFO client.RMProxy: Connecting to ResourceManager at evoido/192.168.25.11:8032
17/03/25 08:07:13 INFO Configuration.deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
17/03/25 08:07:13 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
17/03/25 08:29:50 INFO input.FileInputFormat: Total input paths to process : 1
17/03/25 08:29:51 INFO mapreduce.JobSubmitter: number of splits:1
17/03/25 08:29:51 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1490439401793_0001
17/03/25 08:29:52 INFO impl.YarnClientImpl: Submitted application application_1490439401793_0001
17/03/25 08:29:52 INFO mapreduce.Job: The url to track the job: http://evoido:8088/proxy/application_1490439401793_0001/
17/03/25 08:29:52 INFO mapreduce.Job: Running job: job_1490439401793_0001
17/03/25 08:30:28 INFO mapreduce.Job: Job job_1490439401793_0001 running in uber mode : false
17/03/25 08:30:28 INFO mapreduce.Job:  map 0% reduce 0%
17/03/25 08:30:52 INFO mapreduce.Job:  map 100% reduce 0%
17/03/25 08:30:52 INFO mapreduce.Job: Task Id : attempt_1490439401793_0001_m_000000_0, Status : FAILED
17/03/25 08:30:54 INFO mapreduce.Job:  map 0% reduce 0%
17/03/25 08:37:40 INFO mapreduce.Job:  map 68% reduce 0%
17/03/25 08:37:43 INFO mapreduce.Job:  map 69% reduce 0%
17/03/25 08:37:52 INFO mapreduce.Job:  map 73% reduce 0%
17/03/25 08:38:30 INFO mapreduce.Job:  map 82% reduce 0%
17/03/25 08:39:26 INFO mapreduce.Job:  map 100% reduce 0%
17/03/25 08:40:36 INFO mapreduce.Job:  map 100% reduce 67%
17/03/25 08:40:39 INFO mapreduce.Job: Task Id : attempt_1490439401793_0001_r_000000_0, Status : FAILED
Error: java.lang.RuntimeException: error Operação não permitida, failed to write trailer to /home/idobrt/Vídeos/Result/
        at com.xuggle.mediatool.MediaWriter.close(MediaWriter.java:1306)
        at ads.ifba.edu.tcc.util.MediaWriter.close(MediaWriter.java:97)
        at edu.bupt.videodatacenter.input.VideoRecordWriter.close(VideoRecordWriter.java:61)
        at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.close(ReduceTask.java:550)
        at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:629)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

Here is my VideoRecordWriter implementation:
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.RecordWriter;
import org.apache.hadoop.mapreduce.TaskAttemptContext;

// ImageWritable is HVPI's frame wrapper.
public class VideoRecordWriter extends RecordWriter<Text, ImageWritable> {

    private FileSystem fs;

    @Override
    public void close(TaskAttemptContext job) throws IOException, InterruptedException {
        // Resolve the job output directory and its FileSystem.
        Path outputPath = new Path(job.getConfiguration().get("mapred.output.dir"));
        Configuration conf = job.getConfiguration();
        fs = outputPath.getFileSystem(conf);

        // Finish the video; this is where Xuggler tries to write the trailer and fails.
        MediaWriter.initialize().close();
        //fs.copyFromLocalFile(new Path(MediaWriter.initialize().getVideoPath()), outputPath);
        fs.close();
    }

    @Override
    public void write(Text key, ImageWritable img) throws IOException, InterruptedException {
        // Size the video from the first frame, create the container once,
        // then encode the current frame.
        MediaWriter.initialize().setDimentions(img.getBufferedImage());
        MediaWriter.initialize().creaVideoContainer();
        MediaWriter.initialize().create(img.getBufferedImage());
    }
}
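
For context, a RecordWriter like this is normally handed out by a custom OutputFormat registered on the job. HVPI presumably ships its own, but a minimal sketch of that wiring, with VideoOutputFormat as a hypothetical class name, would look roughly like this:

import java.io.IOException;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.RecordWriter;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Hypothetical OutputFormat that returns the VideoRecordWriter shown above.
// ImageWritable is HVPI's frame wrapper, as in the question's code.
public class VideoOutputFormat extends FileOutputFormat<Text, ImageWritable> {

    @Override
    public RecordWriter<Text, ImageWritable> getRecordWriter(TaskAttemptContext job)
            throws IOException, InterruptedException {
        // One RecordWriter per reduce task; every (key, frame) pair the reducer
        // emits flows through it.
        return new VideoRecordWriter();
    }
}

// In the driver: job.setOutputFormatClass(VideoOutputFormat.class);

The MediaWriter singleton that the RecordWriter drives is shown below.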


    public class MediaWriter{


        private MediaWriter(){

        }

        public static MediaWriter initialize() throws IOException{

            if(instance == null){
              instance = new MediaWriter();

              /*
              fs = FileSystem.get(new Configuration());
              outputStream = fs.create(new Path("hdfs://evoido:9000/video/teste.mp4"));
              containerFormat = IContainerFormat.make();
              containerFormat.setOutputFormat("mpeg4", null, "video/ogg");

              writer.getContainer().setFormat(containerFormat);
              writer = ToolFactory.makeWriter(XugglerIO.map(outputStream));
              */

            }
            return instance;
        }

        public void setDimentions(BufferedImage img){

            if((WIDTH==0)&&(HEIGHT==0)){
            WIDTH = img.getWidth();
            HEIGHT = img.getHeight();
            }
        }

        public void setFileName(Text key){

            if(fileName==null){
            fileName = key.toString();
            VIDEO_NAME += fileName.substring(0, (fileName.lastIndexOf("_")-4))+".mp4";
            }
        }

        public void creaVideoContainer() throws IOException{

            if(writer ==null){
            writer = ToolFactory.makeWriter(VIDEO_NAME);
              /*
                fs = FileSystem.get(new Configuration());
              outputStream = fs.create(new Path("hdfs://evoido:9000/video/teste.mp4"));
              containerFormat = IContainerFormat.make();
              containerFormat.setOutputFormat("mpeg4", null, "video/ogg");
              */
            writer.getContainer().setFormat(containerFormat);

            writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_MPEG4,WIDTH,HEIGHT);

            }
        }
        public void create(BufferedImage img) {
            // we still need to figure out how to set the timeStamp correctly
            if(offset == 0){
                offset = calcTimeStamp();


            }


            writer.encodeVideo(0,img,timeStamp, TimeUnit.NANOSECONDS);
            timeStamp+=offset;


        }


        public void close() {
            // finish encoding and write the container trailer
            writer.close();
        }

        public String getVideoPath(){
            return VIDEO_NAME;
        }
        public void setTime(long interval){
            time+= interval;
        }


        public void setQtdFrame(long frameNum){
            qtdFrame = frameNum;
        }
        public long calcTimeStamp(){

            double interval = 0.0;
            double timeLong = Math.round(time/CONST);
            double result = (time/(double)qtdFrame)*1000.0;
            if((timeLong > 3600)&&((time % qtdFrame)!=0)){
                interval = 1000.0;
                double overplus = timeLong/3600.0;
                if(overplus >=2 ){
                    interval*=overplus;
                }
                result+=interval;
            }

            return (long)Math.round(result);
        }

        public void setFramerate(double frameR){
            if(frameRate == 0){
                frameRate = frameR;
            }
        }


        private static IMediaWriter writer;
        private static long nextFrameTime = 0;  
        private static FileSystem fs;   
        private static OutputStream outputStream;
        private static MediaWriter instance;
        private static IContainerFormat containerFormat;
        private static String VIDEO_NAME = "/home/idobrt/Vídeos/Result/";
        private static int WIDTH =0;
        private static int HEIGHT= 0;
        private static String fileName = null;
        private static long timeStamp = 0;
        private static double time = 0;
        private static long qtdFrame = 0;
        private static long offset = 0;
        private static long startTime = 0;
        private static double frameRate = 0;
        private static double CONST = 1000000.0;
        private static double INTERVAL = 1000.0;
    }

The problem is simply writer = ToolFactory.makeWriter(VIDEO_NAME);, because VIDEO_NAME is a local directory on the NameNode.
Does anyone know the right way to do this? I suppose the correct approach is to write the file to HDFS. It works if the job runs with the LocalJobRunner, but then I lose the parallelism.
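
For reference, the commented-out code in MediaWriter already hints at one way to point Xuggler straight at HDFS: open an output stream on HDFS and hand it to the writer through XugglerIO.map. Below is a minimal sketch of that idea; the path is the one from the commented-out code, the "mp4" container short name is an assumption, and whether Xuggler can actually finalize an MP4 trailer over a non-seekable HDFS stream is not guaranteed (the stack trace fails exactly at trailer time), so treat this as a starting point rather than a confirmed fix:

import java.io.IOException;
import java.io.OutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import com.xuggle.mediatool.IMediaWriter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.xuggler.ICodec;
import com.xuggle.xuggler.IContainerFormat;
import com.xuggle.xuggler.io.XugglerIO;

public class HdfsVideoWriterSketch {

    // Hypothetical helper: open an IMediaWriter whose output goes straight to an HDFS stream.
    public static IMediaWriter openWriter(Configuration conf, String hdfsPath,
                                          int width, int height) throws IOException {
        Path path = new Path(hdfsPath);            // e.g. "hdfs://evoido:9000/video/teste.mp4"
        FileSystem fs = path.getFileSystem(conf);
        OutputStream out = fs.create(path);

        // XugglerIO.map(...) wraps the stream in a pseudo-URL that Xuggler's IO layer
        // can open, as the commented-out code in MediaWriter suggests.
        IMediaWriter writer = ToolFactory.makeWriter(XugglerIO.map(out));

        // There is no file extension to guess the container from, so set it explicitly
        // ("mp4" is an assumption; the question's commented-out code used "mpeg4").
        IContainerFormat fmt = IContainerFormat.make();
        fmt.setOutputFormat("mp4", null, null);
        writer.getContainer().setFormat(fmt);

        writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_MPEG4, width, height);
        return writer;
    }
}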

Best Answer

For now I just save the file locally on the DataNode where the reduce phase runs, and then copy that file to HDFS. It is not the best solution, but it works for now.
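
In other words, the RecordWriter keeps writing the video to a local path on the node running the reducer, and close() copies the finished file into the job's HDFS output directory, essentially un-commenting the copyFromLocalFile line in VideoRecordWriter.close() above. A rough sketch of such a close(), assuming MediaWriter.getVideoPath() returns the local file that was written:

@Override
public void close(TaskAttemptContext job) throws IOException, InterruptedException {
    // Finish the local video first, so Xuggler writes the trailer to a seekable local file.
    MediaWriter.initialize().close();

    // Then push the finished file from the node's local disk into the job output directory.
    Configuration conf = job.getConfiguration();
    Path outputDir = new Path(conf.get("mapreduce.output.fileoutputformat.outputdir"));
    FileSystem fs = outputDir.getFileSystem(conf);
    fs.copyFromLocalFile(new Path(MediaWriter.initialize().getVideoPath()), outputDir);
    // Note: avoid fs.close() here; the FileSystem instance is cached and shared
    // within the task JVM.
}

For this to work, VIDEO_NAME has to point at a directory that actually exists on the node running the reducer (a task-local temp directory is the usual choice), not at a path that only exists on the master.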

A similar question about this Hadoop custom output RecordWriter error can be found on Stack Overflow: https://stackoverflow.com/questions/43016176/
