ios - Record video with AVCaptureSession, add a CIFilter to it, and save it to the Photos album

Tags: ios swift video swift2 video-capture

I want to build a custom video recorder in my app. Right now I can record a video and save it, but I want to apply a filter to the video while recording and save the filtered video to the Photos album. Here is my code for recording the video and saving it:

let captureSession = AVCaptureSession()
let fileOutput = AVCaptureMovieFileOutput()

func initVideoRecording() {

    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryRecord)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("error in audio")
    }

    // Use the stored captureSession property here; a session created locally
    // would be deallocated when this method returns and capture would stop.
    captureSession.beginConfiguration()

    captureSession.sessionPreset = AVCaptureSessionPresetMedium

    let videoLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    videoLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    videoLayer.frame = myImage.bounds
    myImage.layer.addSublayer(videoLayer)

    let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    let audio = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio)
    do
    {
        let input = try AVCaptureDeviceInput(device: backCamera)
        let audioInput = try AVCaptureDeviceInput(device: audio)

        captureSession.addInput(input)
        captureSession.addInput(audioInput)

    }
    catch
    {
        print("can't access camera")
        return
    }

    captureSession.addOutput(fileOutput)

    captureSession.commitConfiguration()

    captureSession.startRunning()

}

@IBAction func recordFunc() {
    if fileOutput.recording {
        myButton.setTitle("record", forState: .Normal)
        fileOutput.stopRecording()
    } else {
        let fileUrl = NSURL(fileURLWithPath: NSTemporaryDirectory()).URLByAppendingPathComponent("\(getCurrentDate())-capturedvideo.mp4")
        fileOutput.startRecordingToOutputFileURL(fileUrl, recordingDelegate: self)

        myButton.setTitle("stop", forState: .Normal)
    }
}

func captureOutput(captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAtURL outputFileURL: NSURL!, fromConnections connections: [AnyObject]!, error: NSError!) {

    // save the recorded video to the Photos album
    UISaveVideoAtPathToSavedPhotosAlbum(outputFileURL.path!, self, "video:didFinishSavingWithError:contextInfo:", nil)
}

I tried using AVCaptureVideoDataOutput instead. In its delegate I used this code:

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {

    connection.videoOrientation = AVCaptureVideoOrientation.Portrait
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    let cameraImage = CIImage(CVPixelBuffer: pixelBuffer!)

    let comicEffect = CIFilter(name: "CIComicEffect")
    comicEffect!.setValue(cameraImage, forKey: kCIInputImageKey)

    let filteredImage = UIImage(CIImage: comicEffect!.valueForKey(kCIOutputImageKey) as! CIImage)

    // UI work has to happen on the main queue
    dispatch_async(dispatch_get_main_queue()) {
        self.myImage.image = filteredImage
    }
}

With this code the filtered frames are only displayed, not recorded: AVCaptureVideoDataOutput just hands me the sample buffers, so nothing is written to disk unless I feed the frames to a writer myself.

=======================/ Here is the solution to my question \=======================

Note: this code is written in Swift 2 with Xcode 7.3.

let captureSession = AVCaptureSession()
    let videoOutput = AVCaptureVideoDataOutput()
    let audioOutput = AVCaptureAudioDataOutput()

    var adapter:AVAssetWriterInputPixelBufferAdaptor!
    var record = false
    var videoWriter:AVAssetWriter!
    var writerInput:AVAssetWriterInput!
    var audioWriterInput:AVAssetWriterInput!
    var lastPath = ""
    var startTime = kCMTimeZero

    var outputSize = CGSizeMake(UIScreen.mainScreen().bounds.width, UIScreen.mainScreen().bounds.height)

override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)

        video()
    }

    func video() {

        do {
            try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryRecord)
            try AVAudioSession.sharedInstance().setActive(true)
        }catch {
            print("error in audio")
        }

        captureSession.beginConfiguration()

        captureSession.sessionPreset = AVCaptureSessionPresetMedium

        let videoLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        videoLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
        //videoLayer.frame = myImage.bounds
        //myImage.layer.addSublayer(videoLayer)

        view.layer.addSublayer(videoLayer)

        let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        let audio = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio)
        do
        {
            let input = try AVCaptureDeviceInput(device: backCamera)
            let audioInput = try AVCaptureDeviceInput(device: audio)

            captureSession.addInput(input)
            captureSession.addInput(audioInput)

        }
        catch
        {
            print("can't access camera")
            return
        }

        let queue = dispatch_queue_create("sample buffer delegate", DISPATCH_QUEUE_SERIAL)

        videoOutput.setSampleBufferDelegate(self,queue: queue)
        audioOutput.setSampleBufferDelegate(self, queue: queue)

        captureSession.addOutput(videoOutput)
        captureSession.addOutput(audioOutput)
        captureSession.commitConfiguration()

        captureSession.startRunning()

    }


    @IBAction func recordFunc() {

        if record {
            myButton.setTitle("record", forState: .Normal)
            record = false
            self.writerInput.markAsFinished()
            audioWriterInput.markAsFinished()
            self.videoWriter.finishWritingWithCompletionHandler { () -> Void in
                print("FINISHED!!!!!")
                UISaveVideoAtPathToSavedPhotosAlbum(self.lastPath, self, "video:didFinishSavingWithError:contextInfo:", nil)
            }


        }else{

            let fileUrl = NSURL(fileURLWithPath: NSTemporaryDirectory()).URLByAppendingPathComponent("\(getCurrentDate())-capturedvideo.MP4")

            lastPath = fileUrl.path!
            videoWriter = try? AVAssetWriter(URL: fileUrl, fileType: AVFileTypeMPEG4)

            let outputSettings = [AVVideoCodecKey : AVVideoCodecH264, AVVideoWidthKey : NSNumber(float: Float(outputSize.width)), AVVideoHeightKey : NSNumber(float: Float(outputSize.height))]

            writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: outputSettings)
            writerInput.expectsMediaDataInRealTime = true
            audioWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: DejalActivityView.getAudioDictionary() as? [String:AnyObject])

            videoWriter.addInput(writerInput)
            videoWriter.addInput(audioWriterInput)

            adapter = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput, sourcePixelBufferAttributes: DejalActivityView.getAdapterDictionary() as? [String:AnyObject])

            videoWriter.startWriting()
            videoWriter.startSessionAtSourceTime(startTime)

            record = true
            myButton.setTitle("stop", forState: .Normal)

        }
    }

    func getCurrentDate()->String{
        let format = NSDateFormatter()
        format.dateFormat = "dd-MM-yyyy hh:mm:ss"
        format.locale = NSLocale(localeIdentifier: "en")
        let date = format.stringFromDate(NSDate())
        return date
    }


extension newCustomCameraViewController: AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate {


    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

        if captureOutput == videoOutput {
            connection.videoOrientation = AVCaptureVideoOrientation.Portrait

            let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
            let cameraImage = CIImage(CVPixelBuffer: pixelBuffer!)

            let comicEffect = CIFilter(name: "CIHexagonalPixellate")

            comicEffect!.setValue(cameraImage, forKey: kCIInputImageKey)

            let filteredImage = UIImage(CIImage: comicEffect!.valueForKey(kCIOutputImageKey) as! CIImage)
            //let filteredImage = UIImage(CIImage: cameraImage)
            if self.record == true{

                // Note: this creates a brand-new serial queue on every frame just to
                // serialize the append; reusing a single stored queue would be cheaper.
                dispatch_sync(dispatch_queue_create("sample buffer append", DISPATCH_QUEUE_SERIAL), {
                    if self.record == true {
                        if self.writerInput.readyForMoreMediaData {
                            let bo = self.adapter.appendPixelBuffer(DejalActivityView.pixelBufferFromCGImage(self.convertCIImageToCGImage(comicEffect!.valueForKey(kCIOutputImageKey) as! CIImage)).takeRetainedValue() as CVPixelBufferRef, withPresentationTime: self.startTime)
                            print("video is \(bo)")
                        }
                    }
                })
            }
            dispatch_async(dispatch_get_main_queue())
            {
                self.myImage.image = filteredImage

            }
        }else if captureOutput == audioOutput{

            if self.record == true{

                let bo = audioWriterInput.appendSampleBuffer(sampleBuffer)
                print("audio is \(bo)")
            }
        }
    }


    func convertCIImageToCGImage(inputImage: CIImage) -> CGImage! {
        // CIContext(options:) is non-optional in Swift 2, so no nil check is needed.
        let context = CIContext(options: nil)
        return context.createCGImage(inputImage, fromRect: inputImage.extent)
    }

    func video(videoPath: NSString, didFinishSavingWithError error: NSError?, contextInfo info: AnyObject) {
        var title = "Success"
        var message = "Video was saved"

        if error != nil {
            title = "Error"
            message = "Video failed to save"
        }

        let alert = UIAlertController(title: title, message: message, preferredStyle: .Alert)
        alert.addAction(UIAlertAction(title: "OK", style: UIAlertActionStyle.Cancel, handler: nil))
        presentViewController(alert, animated: true, completion: nil)
    }
}

These methods from DejalActivityView are in Objective-C and I haven't been able to convert them to Swift, so if anyone can convert them, please edit my code with the conversion. The Objective-C originals follow, and a tentative Swift port comes after them.

+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options, &pxbuffer);
    // CVReturn status = CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pxbuffer);

    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8, 4*size.width, rgbColorSpace, kCGImageAlphaPremultipliedFirst);
    NSParameterAssert(context);

    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);

    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}

+ (NSDictionary *)getAdapterDictionary {

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                           [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];

    return sourcePixelBufferAttributesDictionary;
}

+ (NSDictionary *)getAudioDictionary {
    AudioChannelLayout acl;
    bzero( &acl, sizeof(acl));
    acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;

    NSDictionary* audioOutputSettings = nil;
    audioOutputSettings = [ NSDictionary dictionaryWithObjectsAndKeys:
                           [ NSNumber numberWithInt: kAudioFormatMPEG4AAC ], AVFormatIDKey,
                           //[ NSNumber numberWithInt: 16 ], AVEncoderBitDepthHintKey,
                           [ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
                           [ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,
                           [ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
                           nil ];
//    NSDictionary* audioOutputSettings = nil;
//        audioOutputSettings = [ NSDictionary dictionaryWithObjectsAndKeys:
//                               [ NSNumber numberWithInt: kAudioFormatMPEG4AAC_HE_V2 ], AVFormatIDKey,
//                               [ NSNumber numberWithFloat: 44100.0], AVSampleRateKey,
//                               [ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
//                               nil ];

    return audioOutputSettings;
}
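
A rough Swift 2 port of the three helpers might look like the following. This is an untested sketch: the free-function names are placeholders rather than anything from Dejal, and since Swift manages the CVPixelBuffer's lifetime for you, the takeRetainedValue() call at the append site would be dropped when using this version. (Note also that the delegate above calls a one-argument pixelBufferFromCGImage, presumably a variant that derives the size from the image; the two-argument method is what is shown here.)

import UIKit
import AVFoundation
import CoreVideo

// Sketch: Swift 2 equivalents of the DejalActivityView helpers above.
// The function names are placeholders, not part of any library.

func pixelBufferFromCGImage(image: CGImage, size: CGSize) -> CVPixelBuffer? {
    let options: NSDictionary = [
        kCVPixelBufferCGImageCompatibilityKey as String: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey as String: true
    ]
    var pxbuffer: CVPixelBuffer? = nil
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     Int(size.width), Int(size.height),
                                     kCVPixelFormatType_32ARGB,
                                     options as CFDictionary, &pxbuffer)
    guard let buffer = pxbuffer where status == kCVReturnSuccess else { return nil }

    // Draw the CGImage straight into the pixel buffer's backing memory.
    CVPixelBufferLockBaseAddress(buffer, 0)
    let pxdata = CVPixelBufferGetBaseAddress(buffer)
    let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
    let context = CGBitmapContextCreate(pxdata, Int(size.width), Int(size.height), 8,
                                        4 * Int(size.width), rgbColorSpace,
                                        CGImageAlphaInfo.PremultipliedFirst.rawValue)
    CGContextDrawImage(context,
                       CGRectMake(0, 0, CGFloat(CGImageGetWidth(image)), CGFloat(CGImageGetHeight(image))),
                       image)
    CVPixelBufferUnlockBaseAddress(buffer, 0)
    // No CGContextRelease/CGColorSpaceRelease calls: Swift releases CF objects itself.
    return buffer
}

// Pixel format attributes for the AVAssetWriterInputPixelBufferAdaptor.
func adapterAttributes() -> [String: AnyObject] {
    return [kCVPixelBufferPixelFormatTypeKey as String: NSNumber(unsignedInt: kCVPixelFormatType_32ARGB)]
}

// Mono AAC at 44.1 kHz, matching the Objective-C getAudioDictionary.
func audioSettings() -> [String: AnyObject] {
    var acl = AudioChannelLayout()   // imported C structs start zero-filled in Swift
    acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono
    return [
        AVFormatIDKey: NSNumber(unsignedInt: kAudioFormatMPEG4AAC),
        AVSampleRateKey: NSNumber(float: 44100.0),
        AVNumberOfChannelsKey: NSNumber(int: 1),
        AVChannelLayoutKey: NSData(bytes: &acl, length: sizeof(AudioChannelLayout))
    ]
}

If these compile for you, the DejalActivityView.getAdapterDictionary() and getAudioDictionary() calls in recordFunc become adapterAttributes() and audioSettings(), and the as? [String:AnyObject] casts are no longer needed.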

Best answer

You need to add an AVAssetWriter:

var videoRecorder: AVAssetWriter?

Then, in your delegate callback:

let timeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

if videoRecorder?.status == .Unknown {
    startRecordingTime = timeStamp
    videoRecorder?.startWriting()
    videoRecorder?.startSessionAtSourceTime(timeStamp)
}

You need to configure the writer for every recording you want to make, and you also need to add your inputs to it (see the sketch at the end of this answer).

You may start running into problems because you don't seem to have any of the queues you need set up yet, but as a reference this GitHub project is a very good resource:

https://github.com/waleedka/rosywriterswift

Edit: additional information

You need to init() the writer and then add your video and audio inputs (AVAssetWriterInput), as in the sketch below.
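
For reference, a minimal per-recording setup could look like this. It is only a sketch (it assumes H.264 video at a fixed size and mono AAC audio, like the question's settings); the key point is that an AVAssetWriter is single-use, so a fresh writer and fresh inputs must be created for every clip:

// Sketch: build a fresh writer + inputs for each recording
// (an AVAssetWriter cannot be reused after finishWriting).
func makeWriter(url: NSURL, size: CGSize) throws -> (AVAssetWriter, AVAssetWriterInput, AVAssetWriterInput) {
    let writer = try AVAssetWriter(URL: url, fileType: AVFileTypeMPEG4)

    let videoInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: NSNumber(float: Float(size.width)),
        AVVideoHeightKey: NSNumber(float: Float(size.height))
    ])
    videoInput.expectsMediaDataInRealTime = true   // required for live capture

    let audioInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: [
        AVFormatIDKey: NSNumber(unsignedInt: kAudioFormatMPEG4AAC),
        AVSampleRateKey: NSNumber(float: 44100.0),
        AVNumberOfChannelsKey: NSNumber(int: 1)
    ])
    audioInput.expectsMediaDataInRealTime = true

    writer.addInput(videoInput)
    writer.addInput(audioInput)
    return (writer, videoInput, audioInput)
}

Your record action would call something like this once per recording, keep the returned writer and inputs around, and let the .Unknown status check in the delegate callback above call startWriting() and startSessionAtSourceTime(_:) when the first frame arrives.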

Original question on Stack Overflow: https://stackoverflow.com/questions/36600442/
