ios - Recording video with AVAssetWriter

Tags: ios iphone avassetwriter

I am trying to record video using AVAssetWriter, but whenever I check the AVAssetWriterInput property readyForMoreMediaData before appending data, it is always NO.

I saw related posts reporting a similar problem when recording audio + video, so I stripped out the audio recording part, but the problem persists (readyForMoreMediaData is always NO).

My code:

- (void)startRecordingWithAssetWriter {
    NSLog(@"Setting up capture session");
    captureSession = [[AVCaptureSession alloc] init];

    //----- ADD INPUTS -----
    NSLog(@"Adding video input");

    //ADD VIDEO INPUT
    AVCaptureDevice *videoCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoCaptureDevice) {
        NSError *error;
        videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&error];
        if (!error) {
            if ([captureSession canAddInput:videoInputDevice]) {

                [captureSession addInput:videoInputDevice];
            } else {
                NSLog(@"Couldn't add video input");
            }
        } else {
            NSLog(@"Couldn't create video input");
        }
    } else {
        NSLog(@"Couldn't create video capture device");
    }

    //ADD AUDIO INPUT
    NSLog(@"Adding audio input");
    AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    NSError *error = nil;
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
    NSLog(@"Added audio input: %@", error.description);
    if (audioInput) {
        [captureSession addInput:audioInput];
    }


    //----- ADD OUTPUTS -----

    captureQueue = dispatch_queue_create("com.recordingtest", DISPATCH_QUEUE_SERIAL);


    //-- Create the output for the capture session.
    videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    [videoOutput setAlwaysDiscardsLateVideoFrames:YES];

    [videoOutput setVideoSettings:
     [NSDictionary dictionaryWithObject:
      [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]
                                 forKey:(id)kCVPixelBufferPixelFormatTypeKey]];

    [videoOutput setSampleBufferDelegate:self queue:captureQueue];

    if ([captureSession canAddOutput:videoOutput]) {
        NSLog(@"Added video Output");
        [captureSession addOutput:videoOutput];
    }

//    audioOutput = [[AVCaptureAudioDataOutput alloc] init];
//    [audioOutput setSampleBufferDelegate:self queue:captureQueue];
//    
//    if ([captureSession canAddOutput:audioOutput]) {
//        NSLog(@"Added audio Output");
//        [captureSession addOutput:audioOutput];
//    }

    //Create temporary URL to record to
    NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mp4"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:outputPath])
    {
        NSError *error;
        if ([fileManager removeItemAtPath:outputPath error:&error] == NO)
        {
            //Error - handle if required
        }
    }
    NSError *assetWriterError;
    assetWriter = [AVAssetWriter assetWriterWithURL:outputURL fileType:AVFileTypeMPEG4 error:&assetWriterError];
    if (assetWriter == nil) {
        NSLog(@"Error setting up assetWriter: %@", assetWriterError);
    }
    assetWriterVideoIn = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:nil];
    assetWriterVideoIn.expectsMediaDataInRealTime = YES;
//    assetWriterAudioIn = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:nil];
//    assetWriterAudioIn.expectsMediaDataInRealTime = YES;
    isRecording = YES;


    if ([assetWriter canAddInput:assetWriterVideoIn]) {
        [assetWriter addInput:assetWriterVideoIn];
    }
//    if ([assetWriter canAddInput:assetWriterAudioIn]) {
//        [assetWriter addInput:assetWriterAudioIn];
//    }



    [captureSession commitConfiguration];

    [captureSession startRunning];

}



- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"didOutputSampleBuffer");


    CFRetain(sampleBuffer);

    dispatch_async(captureQueue, ^{

        if (assetWriter) {

            if (isRecording) {
                if (captureOutput == videoOutput) {
                    [self writeSampleBuffer:sampleBuffer ofType:AVMediaTypeVideo];
                }
//                else if (captureOutput == audioOutput) {
//                    [self writeSampleBuffer:sampleBuffer ofType:AVMediaTypeAudio];
//                }
            }


        }

        CFRelease(sampleBuffer);
    });

}

- (void)writeSampleBuffer:(CMSampleBufferRef)sampleBuffer ofType:(NSString *)mediaType {
    NSLog(@"writeSampleBuffer: %ld", (long) assetWriter.status);
    CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

    if (assetWriter.status == AVAssetWriterStatusUnknown) {

        if ([assetWriter startWriting]) {
            NSLog(@"startSessionAtSourceTime");
            [assetWriter startSessionAtSourceTime:presentationTime];
        } else {
            NSLog(@"Error writing initial buffer");
        }
    }

    if (assetWriter.status == AVAssetWriterStatusWriting) {

        if (mediaType == AVMediaTypeVideo) {
            NSLog(@"assetWriterVideoIn.readyForMoreMediaData: %d", assetWriterVideoIn.readyForMoreMediaData);

            if (assetWriterVideoIn.readyForMoreMediaData) {
                NSLog(@"appendSampleBuffer");

                if (![assetWriterVideoIn appendSampleBuffer:sampleBuffer]) {
                    NSLog(@"Error writing video buffer");
                }
            }
        }
//        else if (mediaType == AVMediaTypeAudio) {
//            if (assetWriterAudioIn.readyForMoreMediaData) {
//
//                if (![assetWriterAudioIn appendSampleBuffer:sampleBuffer]) {
//                    NSLog(@"Error writing audio buffer");
//                }
//            }
//        }
    }
}

Best answer

It finally worked once I changed assetWriterVideoIn = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:nil]; to use actual output settings instead of nil. (Passing nil puts the writer input into passthrough mode, which expects already-encoded samples; the uncompressed frames coming from AVCaptureVideoDataOutput cannot be appended that way.)

Change it to this:

NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey,
                          [NSNumber numberWithInt:480], AVVideoWidthKey,[NSNumber numberWithInt:640], AVVideoHeightKey, nil];

assetWriterVideoIn = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:settings];
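As a side note (this is not part of the original answer): hard-coding 480 x 640 ties the writer to one resolution. A minimal Objective-C sketch of an alternative, reading the dimensions from the first delivered sample buffer before creating the input:

```objc
// Sketch: derive the encoder dimensions from the incoming sample buffer
// instead of hard-coding them. Assumes this runs lazily on the first
// didOutputSampleBuffer callback, before the writer input is created.
CMFormatDescriptionRef desc = CMSampleBufferGetFormatDescription(sampleBuffer);
CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(desc);

NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                            AVVideoWidthKey  : @(dims.width),
                            AVVideoHeightKey : @(dims.height) };

assetWriterVideoIn = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                        outputSettings:settings];
assetWriterVideoIn.expectsMediaDataInRealTime = YES;
```

This keeps the writer's output matching whatever resolution the capture session actually delivers.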

Regarding ios - Recording video with AVAssetWriter, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/36859134/
