avfoundation - How do I write an h264 stream to a video file using AVAssetWriter?

Tags: avfoundation h.264 avassetwriter

I want to turn an h.264 stream coming from a server into a video file, but when I call `[assetWriter finishWriting]`, Xcode reports:

Video /var/mobile/Applications/DE4196F1-BB77-4B7D-8C20-7A5D6223C64D/Documents/test.mov cannot be saved to the saved photos album: Error Domain=NSOSStatusErrorDomain Code=-12847 "This movie format is not supported." UserInfo=0x5334830 {NSLocalizedDescription=This movie format is not supported.}"

Below is my code.
`data` is a single h.264 frame, which may be an I-frame or a P-frame.
- (void)_encodeVideoFrame2:(NSData *)data time:(double)tm 
{
  CMBlockBufferRef videoBlockBuffer=NULL;
  CMFormatDescriptionRef videoFormat=NULL;
  CMSampleBufferRef videoSampleBuffer=NULL;
  CMItemCount numberOfSampleTimeEntries=1;
  CMItemCount numberOfSamples=1;
  CMVideoFormatDescriptionCreate(kCFAllocatorDefault, kCMVideoCodecType_H264, 320, 240, NULL, &videoFormat);
  OSStatus result;
  result=CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, NULL, data.length, kCFAllocatorDefault, NULL, 0, data.length, kCMBlockBufferAssureMemoryNowFlag, &videoBlockBuffer);
  result=CMBlockBufferReplaceDataBytes(data.bytes, videoBlockBuffer, 0, data.length);
  CMSampleTimingInfo videoSampleTimingInformation={CMTimeMake(tm*600, 600)};
  size_t sampleSizeArray[1];
  sampleSizeArray[0]=data.length;
  result=CMSampleBufferCreate(kCFAllocatorDefault, videoBlockBuffer, TRUE, NULL, NULL, videoFormat, numberOfSamples, numberOfSampleTimeEntries, &videoSampleTimingInformation, 1, sampleSizeArray, &videoSampleBuffer);
  result = CMSampleBufferMakeDataReady(videoSampleBuffer);
  [assetWriterInput appendSampleBuffer:videoSampleBuffer]; 
}

Maybe the CMSampleBufferCreate parameters are wrong? Thanks.

Best Answer

Try this code:

- (IBAction)createVideo:(id)sender {

    ///////////// setup (OR function definition if we move this to a separate function) ////////////
    // This should be moved to its own function that can take an imageArray, videoOutputPath, etc.:
    // - (void)exportImages:(NSMutableArray *)imageArray
    //        asVideoToPath:(NSString *)videoOutputPath
    //        withFrameSize:(CGSize)imageSize
    //      framesPerSecond:(NSUInteger)fps {

    NSError *error = nil;

    // Set up the file manager and videoOutputPath; remove "test_output.mp4" if it already exists.
    //NSString *videoOutputPath = @"/Users/someuser/Desktop/test_output.mp4";
    NSFileManager *fileMgr = [NSFileManager defaultManager];
    NSString *documentsDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    NSString *videoOutputPath = [documentsDirectory stringByAppendingPathComponent:@"test_output.mp4"];
    //NSLog(@"-->videoOutputPath= %@", videoOutputPath);
    if ([fileMgr removeItemAtPath:videoOutputPath error:&error] != YES)
        NSLog(@"Unable to delete file: %@", [error localizedDescription]);

    CGSize imageSize = CGSizeMake(400, 200);
    NSUInteger fps = 30;

    //NSMutableArray *imageArray;
    //imageArray = [[NSMutableArray alloc] initWithObjects:@"download.jpeg", @"download2.jpeg", nil];
    NSMutableArray *imageArray;
    NSArray *imagePaths = [[NSBundle mainBundle] pathsForResourcesOfType:@"jpg" inDirectory:nil];
    imageArray = [[NSMutableArray alloc] initWithCapacity:imagePaths.count];
    NSLog(@"-->imageArray.count= %lu", (unsigned long)imageArray.count);
    for (NSString *path in imagePaths) {
        [imageArray addObject:[UIImage imageWithContentsOfFile:path]];
        //NSLog(@"-->image path= %@", path);
    }

    ////////////// end setup ///////////////////////////////////

    NSLog(@"Start building video from defined frames.");

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:videoOutputPath]
                                                           fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:imageSize.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:imageSize.height], AVVideoHeightKey,
                                   nil];

    AVAssetWriterInput *videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                              outputSettings:videoSettings];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                   sourcePixelBufferAttributes:nil];

    NSParameterAssert(videoWriterInput);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
    videoWriterInput.expectsMediaDataInRealTime = YES;
    [videoWriter addInput:videoWriterInput];

    // Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    CVPixelBufferRef buffer = NULL;

    // Convert each UIImage to a CGImage and append it.
    int frameCount = 0;
    double numberOfSecondsPerFrame = 6;
    double frameDuration = fps * numberOfSecondsPerFrame;

    //for (VideoFrame *frm in imageArray)
    NSLog(@"****************************");
    for (UIImage *img in imageArray) {
        //UIImage *img = frm._imageFrame;
        buffer = [self pixelBufferFromCGImage:[img CGImage]];

    BOOL append_ok = NO;
    int j = 0;
    while (!append_ok && j < 30) {
        if (adaptor.assetWriterInput.readyForMoreMediaData)  {
            //print out status:
            NSLog(@"Processing video frame (%d,%d)",frameCount,[imageArray count]);
    
            CMTime frameTime = CMTimeMake(frameCount*frameDuration,(int32_t) fps);
            append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
            if(!append_ok){
                NSError *error = videoWriter.error;
                if(error!=nil) {
                    NSLog(@"Unresolved error %@,%@.", error, [error userInfo]);
                }
            }
        }
        else {
            printf("adaptor not ready %d, %d\n", frameCount, j);
            [NSThread sleepForTimeInterval:0.1];
        }
        j++;
    }
    if (!append_ok) {
        printf("error appending image %d, tried %d times, with error\n", frameCount, j);
    }
    frameCount++;
    

    }
    NSLog(@" **************************** ");

    // End the session:
    [videoWriterInput markAsFinished];
    [videoWriter finishWriting];
    NSLog(@"Write Ended");

    ///////////////////////////////////////////////////////////////////////////
    //////////// Now add the audio file to the movie file ////////////
    AVMutableComposition* mixComposition = [AVMutableComposition composition];

    NSString *bundleDirectory = [[NSBundle mainBundle] bundlePath];
    // Audio input file...
    NSString *audio_inputFilePath = [bundleDirectory stringByAppendingPathComponent:@"30secs.mp3"];
    NSURL *audio_inputFileUrl = [NSURL fileURLWithPath:audio_inputFilePath];

    // This is the video file that was just written above; the full path to the file is in --> videoOutputPath
    NSURL *video_inputFileUrl = [NSURL fileURLWithPath:videoOutputPath];

    // Create the final video output file as a MOV file - may need to be MP4, but it works so far...
    NSString *outputFilePath = [documentsDirectory stringByAppendingPathComponent:@"final_video.mp4"];
    NSURL *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];

    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];

    CMTime nextClipStartTime = kCMTimeZero;

    AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:video_inputFileUrl options:nil];
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero,videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    //nextClipStartTime = CMTimeAdd(nextClipStartTime, a_timeRange.duration);

    AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:audio_inputFileUrl options:nil];
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    //AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    __block AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetPassthrough];

    NSLog(@"supported file types= %@", [_assetExport supportedFileTypes]);
    _assetExport.outputFileType = @"com.apple.quicktime-movie";
    NSLog(@"supported file types= %@", [_assetExport supportedFileTypes]);
    _assetExport.outputURL = outputFileUrl;

    [_assetExport exportAsynchronouslyWithCompletionHandler:^{
        switch (_assetExport.status) {
            case AVAssetExportSessionStatusCompleted:
                // Custom method to import the exported video
                NSLog(@"DONE!!!");
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Failed: %@", _assetExport.error);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Canceled: %@", _assetExport.error);
                break;
            default:
                break;
        }
    }];

    ///// That's it... the final video file will be written here...
    NSLog(@"DONE.....outputFilePath--->%@", outputFilePath);

    // The final video file will be located somewhere like:
    ///Users/caferrara/Library/Application Support/iPhone Simulator/6.0/Applications/D4B12FEE-E09C-4B12-B772-7F1BD6011BE1/Documents/outputFile.mov

}

Regarding "avfoundation - How do I write an h264 stream to a video file using AVAssetWriter?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/13138385/
