ios - Cannot implement AVCaptureVideoDataOutputSampleBufferDelegate as self

Tags: ios xcode swift swift2 avfoundation

I'm new to Swift and have been trying to build a real-time video processing app with OpenCV. I'm using Swift, iOS 9, and Xcode 7.

I'm having trouble capturing the frames; I put together the following code from several tutorials.

  1. In the code below, I keep getting an error:

    func setupCameraSession() {
        let devices = AVCaptureDevice.devices()
        var captureDevice:AVCaptureDevice?
    
        do {
            if cameraType == CameraType.Front {
                for device in devices {
                    if device.position == AVCaptureDevicePosition.Front {
                        captureDevice = device as? AVCaptureDevice
                        break
                    }
                }
            }
            else {
                captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo) as AVCaptureDevice
            }
    
            let deviceInput = try AVCaptureDeviceInput(device: captureDevice)
    
            cameraSession.beginConfiguration()
    
            if (cameraSession.canAddInput(deviceInput) == true) {
                cameraSession.addInput(deviceInput)
            }
    
            let dataOutput = AVCaptureVideoDataOutput()
            dataOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString) : NSNumber(unsignedInt: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)]
            dataOutput.alwaysDiscardsLateVideoFrames = true
            dataOutput.setSampleBufferDelegate(self, queue: dispatch_queue_create("cameraQueue", DISPATCH_QUEUE_SERIAL))
    
            if (cameraSession.canAddOutput(dataOutput) == true) {
                cameraSession.addOutput(dataOutput)
            }
    
            cameraSession.commitConfiguration()
    
        }
        catch let error as NSError {
            NSLog("\(error), \(error.localizedDescription)")
        }
    }
    

The error is as follows:

FirstViewController.swift:137:48: 
Cannot convert value of type 'FirstViewController' to expected argument 
type 'AVCaptureVideoDataOutputSampleBufferDelegate!'

It is the use of "self" in the setSampleBufferDelegate call that causes the error.

I believe this is the key part for capturing each frame, but I'm not sure what it does.

  2. I would also like to know how I should use the following functions to capture frames and process them into a UIImage:

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        // Here you collect each frame and process it
        print("frame received")
    }
    
    func captureOutput(captureOutput: AVCaptureOutput!, didDropSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        // Here you can count how many frames are dropped
        print("frame dropped")
    }
    

When and where are these functions called, and how should I convert each captured frame into a UIImage?

Here is the complete code of the FirstViewController class:

//
//  FirstViewController.swift
//  nVisoDemoApp
//
//  Created by Timothy Llewellynn on 30/06/16.
//  Copyright © 2016 Timothy Llewellynn. All rights reserved.
//

import UIKit
import AVFoundation

class FirstViewController: UIViewController, UITabBarControllerDelegate {

    @IBOutlet weak var OpenCVVersion: UILabel!
    @IBOutlet weak var OpenCVDisplay: UIImageView!

    @IBOutlet weak var SadnessValue: UILabel!
    @IBOutlet weak var NeutralValue: UILabel!
    @IBOutlet weak var DisgustValue: UILabel!
    @IBOutlet weak var AngerValue: UILabel!
    @IBOutlet weak var SurpriseValue: UILabel!
    @IBOutlet weak var FearValue: UILabel!
    @IBOutlet weak var HappinessValue: UILabel!

    enum CameraType {
        case Front
        case Back
    }

    var cameraType = CameraType.Front

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
//        SadnessValue.text = "[Value]"
//        NeutralValue.text = "[Value]"
//        DisgustValue.text = "[Value]"
//        AngerValue.text = "[Value]"
//        SurpriseValue.text = "[Value]"
//        FearValue.text = "[Value]"
//        HappinessValue.text = "[Value]"

//      OpenCVDisplay.image =

        self.view.sendSubviewToBack(OpenCVDisplay)
        setupCameraSession()
        OpenCVVersion.text = CVWrapper.versionOpenCV()
        OpenCVDisplay.layer.addSublayer(previewLayer)
        cameraSession.startRunning()

        let leftSwipe = UISwipeGestureRecognizer(target: self, action: Selector("handleSwipes:"))
        leftSwipe.direction = .Left
        view.addGestureRecognizer(leftSwipe)
    }

    func handleSwipes(sender:UISwipeGestureRecognizer) {
        if (sender.direction == .Left) {
            let selectedIndex: Int = self.tabBarController!.selectedIndex
            self.tabBarController!.selectedIndex = selectedIndex + 1
        }

        if (sender.direction == .Right) {

        }
    }

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)

        self.view.sendSubviewToBack(OpenCVDisplay)
        setupCameraSession()
        OpenCVVersion.text = CVWrapper.versionOpenCV()
        OpenCVDisplay.layer.addSublayer(previewLayer)
        cameraSession.startRunning()
    }

    override func viewWillDisappear(animated: Bool) {
        super.viewWillDisappear(animated)

        cameraSession.stopRunning()
        previewLayer.removeFromSuperlayer()

        let currentCameraInput: AVCaptureInput = cameraSession.inputs[0] as! AVCaptureInput
        cameraSession.removeInput(currentCameraInput)
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    lazy var cameraSession: AVCaptureSession = {
        let s = AVCaptureSession()
        s.sessionPreset = AVCaptureSessionPresetHigh
        return s
    }()

    lazy var previewLayer: AVCaptureVideoPreviewLayer = {
        let preview =  AVCaptureVideoPreviewLayer(session: self.cameraSession)
        preview.bounds = CGRect(x: 0, y: 0, width: self.view.bounds.width, height: self.view.bounds.height)
        preview.position = CGPoint(x: CGRectGetMidX(self.view.bounds), y: CGRectGetMidY(self.view.bounds))
        preview.videoGravity = AVLayerVideoGravityResize
        return preview
    }()

    func setupCameraSession() {
        let devices = AVCaptureDevice.devices()
        var captureDevice:AVCaptureDevice?

        do {
            if cameraType == CameraType.Front {
                for device in devices {
                    if device.position == AVCaptureDevicePosition.Front {
                        captureDevice = device as? AVCaptureDevice
                        break
                    }
                }
            }
            else {
                captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo) as AVCaptureDevice
            }

            let deviceInput = try AVCaptureDeviceInput(device: captureDevice)

            cameraSession.beginConfiguration()

            if (cameraSession.canAddInput(deviceInput) == true) {
                cameraSession.addInput(deviceInput)
            }

            let dataOutput = AVCaptureVideoDataOutput()
            dataOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString) : NSNumber(unsignedInt: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)]
            dataOutput.alwaysDiscardsLateVideoFrames = true

//            let queue = dispatch_queue_create("cameraQueue", DISPATCH_QUEUE_SERIAL)
//            dataOutput.setSampleBufferDelegate(self, queue: queue)
            dataOutput.setSampleBufferDelegate(self, queue: dispatch_queue_create("cameraQueue", DISPATCH_QUEUE_SERIAL))
            if (cameraSession.canAddOutput(dataOutput) == true) {
                cameraSession.addOutput(dataOutput)
            }
            // Compiler error reported on the setSampleBufferDelegate line above:
            // /Users/tllewellynn/Desktop/dev/nVisoDemo/nVisoDemo/FirstViewController.swift:137:48: Cannot convert value of type 'FirstViewController' to expected argument type 'AVCaptureVideoDataOutputSampleBufferDelegate!'
            cameraSession.commitConfiguration()

        }
        catch let error as NSError {
            NSLog("\(error), \(error.localizedDescription)")
        }
    }

//    func capturePicture(){
//        
//        print("Capturing image")
//        var stillImageOutput = AVCaptureStillImageOutput()
//        stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
//        cameraSession.addOutput(stillImageOutput)
//        
//        if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo){
//            stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: {
//                (sampleBuffer, error) in
//                var imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
//                var dataProvider = CGDataProviderCreateWithCFData(imageData)
//                var cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)
////                var image = UIImage(CGImage: cgImageRef, scale: 1.0, orientation: UIImageOrientation.Right)
//                
////                var imageView = UIImageView(image: image)
////                imageView.frame = CGRect(x:0, y:0, width:self.screenSize.width, height:self.screenSize.height)
////                
////                //Show the captured image to
////                self.view.addSubview(imageView)
////                
////                //Save the captured preview to image
////                UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
//                
//            })
//        }
//    }

    @IBAction func SwitchCameraAction(sender: UIButton) {
        cameraType = cameraType == CameraType.Back ? CameraType.Front : CameraType.Back
        cameraSession.stopRunning()
        previewLayer.removeFromSuperlayer()

        let currentCameraInput: AVCaptureInput = cameraSession.inputs[0] as! AVCaptureInput
        cameraSession.removeInput(currentCameraInput)

        setupCameraSession()
        OpenCVDisplay.layer.addSublayer(previewLayer)
        cameraSession.startRunning()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        // Here you collect each frame and process it
        print("frame received")
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didDropSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        // Here you can count how many frames are dropped
        print("frame dropped")
    }
}

Any insights?

Best answer

Your class declaration

    class FirstViewController: UIViewController, UITabBarControllerDelegate

just needs to be changed to

    class FirstViewController: UIViewController, UITabBarControllerDelegate, AVCaptureVideoDataOutputSampleBufferDelegate

so that FirstViewController actually conforms to the protocol that setSampleBufferDelegate(_:queue:) expects, and self can be passed as the delegate.
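
With that conformance in place, AVFoundation calls captureOutput(_:didOutputSampleBuffer:fromConnection:) on the "cameraQueue" dispatch queue for every frame once startRunning() has been called, and captureOutput(_:didDropSampleBuffer:fromConnection:) whenever a frame is discarded (for example because alwaysDiscardsLateVideoFrames is true and processing falls behind). To turn each frame into a UIImage, a minimal sketch of the delegate method inside FirstViewController might look like the following; it assumes the Swift 2 / iOS 9 APIs used in the question and renders through Core Image, which is only one possible approach, not necessarily the one the original author intended:

    // A sketch, not the original author's code: convert each CMSampleBuffer into a UIImage via Core Image.
    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        // Runs on "cameraQueue" for every frame the data output delivers.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // Wrap the pixel buffer in a CIImage and render it into a CGImage.
        // (In a real app, create the CIContext once and reuse it; building it per frame is expensive.)
        let ciImage = CIImage(CVPixelBuffer: pixelBuffer)
        let context = CIContext()
        let rect = CGRect(x: 0, y: 0,
                          width: CVPixelBufferGetWidth(pixelBuffer),
                          height: CVPixelBufferGetHeight(pixelBuffer))
        let cgImage = context.createCGImage(ciImage, fromRect: rect)
        let image = UIImage(CGImage: cgImage)

        // UIKit may only be touched on the main thread.
        dispatch_async(dispatch_get_main_queue()) {
            self.OpenCVDisplay.image = image
        }
    }

If the goal is to hand frames to OpenCV, it is often easier to request kCVPixelFormatType_32BGRA in videoSettings instead of the bi-planar YUV format used above, since a BGRA pixel buffer maps directly onto an 8-bit, 4-channel cv::Mat without an extra colour conversion.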

Regarding ios - Cannot implement AVCaptureVideoDataOutputSampleBufferDelegate as self, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/38149917/
