ios - How to record video using AVCaptureVideoDataOutput

Tags: ios swift video avfoundation avcaptureoutput

I am using an AVCaptureSession to get the camera output, and I have successfully added the audio and video inputs and outputs:

{

    var captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo) as AVCaptureDevice

    var error: NSError? = nil

    do {

        //remove the previous inputs
        let inputs = cameraSession.inputs as! [AVCaptureDeviceInput]
        for oldInput:AVCaptureDeviceInput in inputs {
            cameraSession.removeInput(oldInput)
        }
        cameraSession.beginConfiguration()

        if cameraPosition.isEqualToString("Front") {
            captureDevice = cameraWithPosition(.Front)!
        }
        else {
            captureDevice = cameraWithPosition(.Back)!
        }

        let deviceInput = try AVCaptureDeviceInput(device: captureDevice)

        if cameraSession.canAddInput(deviceInput) {
            cameraSession.addInput(deviceInput)
        }

        let dataOutput = AVCaptureVideoDataOutput()
        dataOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString) : NSNumber(unsignedInt: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)]
        dataOutput.alwaysDiscardsLateVideoFrames = true

        if cameraSession.canAddOutput(dataOutput) {
            cameraSession.addOutput(dataOutput)
        }

        let audioCheck = AVCaptureDevice.devicesWithMediaType(AVMediaTypeAudio)
        if audioCheck.isEmpty {
            print("no audio device")
            return
        }

        let audioDevice = audioCheck.first as! AVCaptureDevice

        var audioDeviceInput: AVCaptureDeviceInput?

        do {
            audioDeviceInput = try AVCaptureDeviceInput(device: audioDevice)
        } catch let error2 as NSError {
            error = error2
            audioDeviceInput = nil
        } catch {
            fatalError()
        }

        if let error = error {
            print(error)
            let alert = UIAlertController(title: "Error", message: error.localizedDescription,
                preferredStyle: .Alert)
            alert.addAction(UIAlertAction(title: "OK", style: .Default, handler: nil))
            self.presentViewController(alert, animated: true, completion: nil)
        }
        // Unwrap before adding: audioDeviceInput is nil if creation failed
        if let audioDeviceInput = audioDeviceInput where cameraSession.canAddInput(audioDeviceInput) {
            cameraSession.addInput(audioDeviceInput)
        }

        cameraSession.commitConfiguration()

        let queue = dispatch_queue_create("com.invasivecode.videoQueue", DISPATCH_QUEUE_SERIAL)
        dataOutput.setSampleBufferDelegate(self, queue: queue)

    }
    catch let error as NSError {
        NSLog("\(error), \(error.localizedDescription)")
    }
}

With AVCaptureMovieFileOutput, the output video can be saved to the photo library using:

movieFileOutput.startRecordingToOutputFileURL(outputFilePath, recordingDelegate: self)
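For comparison, the full AVCaptureMovieFileOutput flow looks roughly like this (a sketch in the same Swift 2 style as the code above; `cameraSession` and the delegate conformance are assumed, and the output path is illustrative):

```swift
// Assumes `cameraSession` is a configured AVCaptureSession and `self`
// conforms to AVCaptureFileOutputRecordingDelegate.
let movieFileOutput = AVCaptureMovieFileOutput()
if cameraSession.canAddOutput(movieFileOutput) {
    cameraSession.addOutput(movieFileOutput)
}

let outputFilePath = NSURL(fileURLWithPath: NSTemporaryDirectory() + "movie.mov")
movieFileOutput.startRecordingToOutputFileURL(outputFilePath, recordingDelegate: self)
// ... later ...
movieFileOutput.stopRecording()

// The delegate is notified when the file is complete:
func captureOutput(captureOutput: AVCaptureFileOutput!,
    didFinishRecordingToOutputFileAtURL outputFileURL: NSURL!,
    fromConnections connections: [AnyObject]!, error: NSError!) {
    // e.g. save outputFileURL to the photo library here
}
```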

But I am using AVCaptureVideoDataOutput as my output so I can do extra work on the metadata I get from its delegate, and when I try to record video I cannot find any method on it to start and stop recording.

How can I record video using AVCaptureVideoDataOutput?

Best Answer

You need an AVCaptureSession to do that:

//First add AVCaptureVideoDataOutput to AVCaptureSession 
AVCaptureSession *_captureSession;
_captureSession = [[AVCaptureSession alloc] init];
......Configuration......

AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
......Configuration......
if ( [_captureSession canAddOutput:videoOut] ) {
    [_captureSession addOutput:videoOut];
}

//Then use captureSession to start and stop recording
[_captureSession startRunning];
[_captureSession stopRunning];

Please take a look at RosyWriterCapturePipeline.m, which is a very good example of this:

RosyWriter
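RosyWriter's core pattern can be sketched in Swift (Swift 2 style, matching the question's code; the class and property names below are illustrative, not taken from RosyWriter itself): you feed the sample buffers your AVCaptureVideoDataOutputSampleBufferDelegate receives into an AVAssetWriter, so "start/stop recording" becomes setting a flag and finishing the writer. Audio and error handling are omitted.

```swift
import AVFoundation

// Illustrative recorder: appends frames from the data-output delegate
// to an AVAssetWriter that writes a QuickTime movie file.
class SampleBufferRecorder {

    let assetWriter: AVAssetWriter
    let writerInput: AVAssetWriterInput
    var recording = false
    private var sessionStarted = false

    init(outputURL: NSURL) throws {
        assetWriter = try AVAssetWriter(URL: outputURL, fileType: AVFileTypeQuickTimeMovie)
        let settings: [String: AnyObject] = [
            AVVideoCodecKey: AVVideoCodecH264,
            AVVideoWidthKey: 1280,   // match your session preset
            AVVideoHeightKey: 720
        ]
        writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: settings)
        writerInput.expectsMediaDataInRealTime = true
        assetWriter.addInput(writerInput)
    }

    // Call this from captureOutput(_:didOutputSampleBuffer:fromConnection:)
    func append(sampleBuffer: CMSampleBuffer) {
        guard recording else { return }
        if !sessionStarted {
            // Anchor the movie's timeline to the first frame's timestamp
            assetWriter.startWriting()
            assetWriter.startSessionAtSourceTime(
                CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
            sessionStarted = true
        }
        if writerInput.readyForMoreMediaData {
            writerInput.appendSampleBuffer(sampleBuffer)
        }
    }

    // "Stop recording": finish the file, which is then ready at outputURL.
    func stop(completion: () -> Void) {
        recording = false
        writerInput.markAsFinished()
        assetWriter.finishWritingWithCompletionHandler(completion)
    }
}
```

In the delegate you would simply forward each buffer with `recorder.append(sampleBuffer)`; setting `recorder.recording = true` and calling `recorder.stop` play the role of the `startRecordingToOutputFileURL`/`stopRecording` pair that AVCaptureMovieFileOutput provides.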

Regarding "ios - How to record video using AVCaptureVideoDataOutput", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/38977579/
