ios - Changing AVCaptureDeviceInput causes AVAssetWriterStatusFailed

Tags: ios objective-c avfoundation avcapturesession avassetwriter

I am trying to switch the camera view between front and back, and on its own that works fine. Recording with the Pause/Record option also works fine as long as the camera is not flipped. But once the camera view is flipped, any video recorded after that is not saved, and the writer ends up with AVAssetWriterStatusFailed ("The operation could not be completed"). Can anyone help me find where I am going wrong? My code is below.

Camera.m

- (void)flipCamera
{
    NSArray *inputs = _session.inputs;
    for (AVCaptureDeviceInput *input in inputs) {
        AVCaptureDevice *device = input.device;
        if ([device hasMediaType:AVMediaTypeVideo]) {
            AVCaptureDevicePosition position = device.position;
            AVCaptureDevice *newCamera = nil;
            AVCaptureDeviceInput *newInput = nil;
            if (position == AVCaptureDevicePositionFront)
                newCamera = [self cameraWithPosition:AVCaptureDevicePositionBack];
            else
                newCamera = [self cameraWithPosition:AVCaptureDevicePositionFront];
            newInput = [AVCaptureDeviceInput deviceInputWithDevice:newCamera error:nil];
            // beginConfiguration ensures that pending changes are not applied immediately
            [_session beginConfiguration];
            [_session removeInput:input];
            [_session addInput:newInput];
            // Changes take effect once the outermost commitConfiguration is invoked.
            [_session commitConfiguration];
            break;
        }
    }
    for (AVCaptureDeviceInput *input in inputs) {
        AVCaptureDevice *device = input.device;
        if ([device hasMediaType:AVMediaTypeAudio]) {
            // audio input from default mic
            AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
            AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
            // [_session addInput:micinput];
            // beginConfiguration ensures that pending changes are not applied immediately
            [_session beginConfiguration];
            [_session removeInput:input];
            [_session addInput:newInput];
            // Changes take effect once the outermost commitConfiguration is invoked.
            [_session commitConfiguration];
            break;
        }
    }
}
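As a side note: the swap above passes error:nil and never asks the session whether it can accept the new input, so a failed deviceInputWithDevice: call silently leaves the session without a video input. A more defensive version of the video branch might look like the sketch below (it reuses the _session, input and newCamera names from the question and is only an illustration, not the author's code):

NSError *error = nil;
AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:newCamera error:&error];
if (newInput == nil) {
    NSLog(@"Could not create camera input: %@", error.localizedDescription);
    return;
}
[_session beginConfiguration];
[_session removeInput:input];
if ([_session canAddInput:newInput]) {
    [_session addInput:newInput];
} else {
    // Put the old input back so the session keeps running with the previous camera.
    [_session addInput:input];
}
[_session commitConfiguration];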

- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position
{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices)
        if (device.position == position)
            return device;
    return nil;
}
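Also note that devicesWithMediaType: has been deprecated since iOS 10. On newer deployment targets the same lookup can be written with AVCaptureDeviceDiscoverySession; a rough equivalent, assuming the built-in wide-angle camera is all that is needed:

- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position
{
    AVCaptureDeviceDiscoverySession *discovery =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                               mediaType:AVMediaTypeVideo
                                                                position:position];
    // Returns nil when no camera exists at the requested position, same as the original helper.
    return discovery.devices.firstObject;
}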

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    BOOL bVideo = YES;
    @synchronized(self)
    {
        if (!self.isCapturing || self.isPaused)
        {
            return;
        }
        if (connection != _videoConnection)
        {
            bVideo = NO;
        }
        if ((_encoder == nil) && !bVideo)
        {
            CMFormatDescriptionRef fmt = CMSampleBufferGetFormatDescription(sampleBuffer);
            [self setAudioFormat:fmt];
            NSString* filename = [NSString stringWithFormat:@"capture%d.mp4", _currentFile];
            NSString* path = [NSTemporaryDirectory() stringByAppendingPathComponent:filename];
            _encoder = [VideoEncoder encoderForPath:path Height:_cy width:_cx channels:_channels samples:_samplerate];
        }
        if (_discont)
        {
            if (bVideo)
            {
                return;
            }
            _discont = NO;
            // calc adjustment
            CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTime last = bVideo ? _lastVideo : _lastAudio;
            if (last.flags & kCMTimeFlags_Valid)
            {
                if (_timeOffset.flags & kCMTimeFlags_Valid)
                {
                    pts = CMTimeSubtract(pts, _timeOffset);
                }
                CMTime offset = CMTimeSubtract(pts, last);
                NSLog(@"Setting offset from %s", bVideo ? "video" : "audio");
                NSLog(@"Adding %f to %f (pts %f)", ((double)offset.value)/offset.timescale, ((double)_timeOffset.value)/_timeOffset.timescale, ((double)pts.value/pts.timescale));
                // this stops us having to set a scale for _timeOffset before we see the first video time
                if (_timeOffset.value == 0)
                {
                    _timeOffset = offset;
                }
                else
                {
                    _timeOffset = CMTimeAdd(_timeOffset, offset);
                }
            }
            _lastVideo.flags = 0;
            _lastAudio.flags = 0;
        }
        // retain so that we can release either this or modified one
        CFRetain(sampleBuffer);
        if (_timeOffset.value > 0)
        {
            CFRelease(sampleBuffer);
            sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset];
        }
        // record most recent time so we know the length of the pause
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        CMTime dur = CMSampleBufferGetDuration(sampleBuffer);
        if (dur.value > 0)
        {
            pts = CMTimeAdd(pts, dur);
        }
        if (bVideo)
        {
            _lastVideo = pts;
        }
        else
        {
            _lastAudio = pts;
        }
    }
    // pass frame to encoder
    [_encoder encodeFrame:sampleBuffer isVideo:bVideo];
    CFRelease(sampleBuffer);
}
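adjustTime:by: is not shown in the question. In the CameraEngine-style sample this capture pipeline appears to be based on, it re-stamps the buffer with CMSampleBufferCreateCopyWithNewTiming so that the paused interval is subtracted from every timestamp; a sketch of that idea (an assumption about the missing method, not necessarily the author's actual code):

- (CMSampleBufferRef)adjustTime:(CMSampleBufferRef)sample by:(CMTime)offset
{
    CMItemCount count;
    // First call asks how many timing entries the buffer carries.
    CMSampleBufferGetSampleTimingInfoArray(sample, 0, NULL, &count);
    CMSampleTimingInfo *info = malloc(sizeof(CMSampleTimingInfo) * count);
    CMSampleBufferGetSampleTimingInfoArray(sample, count, info, &count);
    for (CMItemCount i = 0; i < count; i++)
    {
        // Shift every timestamp back by the accumulated pause duration.
        info[i].decodeTimeStamp = CMTimeSubtract(info[i].decodeTimeStamp, offset);
        info[i].presentationTimeStamp = CMTimeSubtract(info[i].presentationTimeStamp, offset);
    }
    CMSampleBufferRef adjusted = NULL;
    CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sample, count, info, &adjusted);
    free(info);
    return adjusted;
}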

Encoder.m

- (BOOL)encodeFrame:(CMSampleBufferRef)sampleBuffer isVideo:(BOOL)bVideo
{
    if (CMSampleBufferDataIsReady(sampleBuffer))
    {
        if (_writer.status == AVAssetWriterStatusUnknown)
        {
            CMTime startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            [_writer startWriting];
            [_writer startSessionAtSourceTime:startTime];
        }
        if (_writer.status == AVAssetWriterStatusFailed)
        {
            // Once the camera view has been flipped, execution ends up in this branch:
            // "writer error The operation could not be completed"
            NSLog(@"writer error %@", _writer.error.localizedDescription);
            return NO;
        }
        if (bVideo)
        {
            if (_videoInput.readyForMoreMediaData == YES)
            {
                [_videoInput appendSampleBuffer:sampleBuffer];
                return YES;
            }
        }
        else
        {
            if (_audioInput.readyForMoreMediaData)
            {
                [_audioInput appendSampleBuffer:sampleBuffer];
                return YES;
            }
        }
    }
    return NO;
}
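The encoderForPath:Height:width:channels:samples: factory is not shown either. Assuming it follows the common AVAssetWriter setup, the video side presumably looks roughly like the sketch below (the method name initPath:Height:width: is hypothetical, and the audio input settings are omitted for brevity):

- (void)initPath:(NSString *)path Height:(int)cy width:(int)cx
{
    NSURL *url = [NSURL fileURLWithPath:path];
    // AVAssetWriter cannot overwrite an existing file, so clear the target first.
    [[NSFileManager defaultManager] removeItemAtURL:url error:nil];

    _writer = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeMPEG4 error:nil];
    NSDictionary *videoSettings = @{
        AVVideoCodecKey  : AVVideoCodecH264,
        AVVideoWidthKey  : @(cx),
        AVVideoHeightKey : @(cy)
    };
    _videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                     outputSettings:videoSettings];
    // Required for live capture so the writer does not wait for buffered data.
    _videoInput.expectsMediaDataInRealTime = YES;
    [_writer addInput:_videoInput];
}

If the writer ever reaches AVAssetWriterStatusFailed, inspecting _writer.error (as encodeFrame: above already does) is the quickest way to see whether the failure comes from the file, the settings, or an out-of-order timestamp.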

Thanks in advance.

Best Answer

The problem is this line:

if (connection != _videoConnection)
{
    bVideo = NO;
}

When you switch cameras, a new video connection is created; I don't know exactly where it gets created. But if you change this line as follows, it works:

//if (connection != _videoConnection)
if ([connection.output connectionWithMediaType:AVMediaTypeVideo] == nil)
{
    bVideo = NO;
}
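An alternative that attacks the root cause is to refresh the cached connection as soon as the camera has been swapped, since removeInput:/addInput: discards the old AVCaptureConnection objects and creates new ones. A sketch, assuming the video data output is kept in an ivar such as _videoOutput (not shown in the question):

// At the end of flipCamera, after [_session commitConfiguration]:
_videoConnection = [_videoOutput connectionWithMediaType:AVMediaTypeVideo];

With the cached _videoConnection refreshed this way, the original connection != _videoConnection comparison works again.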

Regarding "ios - Changing AVCaptureDeviceInput causes AVAssetWriterStatusFailed", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/23805863/
