ios - Setting picture orientation with AVFoundation in Swift 3

Tags: ios swift camera avfoundation orientation

I'm new to Swift and don't have much programming practice, so maybe this question is easy for you to solve O:-)...

I just need a simple camera with AVFoundation to take pictures; no live mode or video recording. I have a problem with the image orientation, and so far I haven't been able to find an answer in the forums. I know many people have this problem, but nothing I've tried has worked, and I can't fix it in my code. I don't really understand how capturing a picture with AVFoundation works, or where exactly the picture is taken...

Can you tell me how to set the image orientation for a picture, so that when I use the picture in an ImageView it is displayed in the device's orientation?

Here is my code:

import UIKit
import AVFoundation
import Photos

class ViewController: UIViewController {

@IBOutlet weak var cameraView: UIView!

@IBOutlet weak var imageProcessedView: UIImageView!
var imageProcessed: UIImage?
let captureSession = AVCaptureSession()
var captureDevice: AVCaptureDevice?
var previewLayer: AVCaptureVideoPreviewLayer?
var stillImageOutput: AVCaptureStillImageOutput = AVCaptureStillImageOutput()

override func viewDidLoad() {
    super.viewDidLoad()
    UIDevice.current.beginGeneratingDeviceOrientationNotifications()

    imageProcessedView.alpha = 1.0
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto
    backCamera()
}

func backCamera()
{
    let devices = AVCaptureDevice.devices()

    for device in devices! {
        if ((device as AnyObject).hasMediaType(AVMediaTypeVideo)){
            captureDevice = device as? AVCaptureDevice
            do {
                try captureSession.addInput(AVCaptureDeviceInput(device: captureDevice))
            } catch {
                print("error")
            }
            break
        }
    }
 }


//Session starts and preview appears:
@IBAction func takePhoto(_ sender: Any) {
    if captureDevice != nil
    {
        imageProcessedView.alpha = 0.0
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        self.cameraView.layer.addSublayer(previewLayer!)
        previewLayer?.frame = self.cameraView.layer.bounds
        previewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
        captureSession.startRunning()
        stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

        if captureSession.canAddOutput(stillImageOutput){
            captureSession.addOutput(stillImageOutput)
        }
    } else {
        print("No captureDevice")
    }
}

@IBAction func capturePicture(_ sender: Any) {
    if let videoConnection = stillImageOutput.connection(withMediaType: AVMediaTypeVideo){

        var currentDevice: UIDevice
        currentDevice = .current

        var deviceOrientation: UIDeviceOrientation
        deviceOrientation = currentDevice.orientation

        // Variables holding the orientation both as an enum case (e.g. .portrait) and as its rawValue (e.g. 0 for portrait mode)
        var avCaptureOrientation: AVCaptureVideoOrientation
        avCaptureOrientation = .portrait

        var orientationValue: Int
        orientationValue = 0

        if deviceOrientation == .portrait {
            avCaptureOrientation = .portrait
            orientationValue = 0
            print("Device: Portrait")
        }else if (deviceOrientation == .landscapeLeft){
            avCaptureOrientation = .landscapeLeft
            orientationValue = 3
            print("Device: LandscapeLeft")
        }else if (deviceOrientation == .landscapeRight){
            avCaptureOrientation = .landscapeRight
            orientationValue = 2
            print("Device LandscapeRight")
        }else if (deviceOrientation == .portraitUpsideDown){
            avCaptureOrientation = .portraitUpsideDown
            orientationValue = 1
            print("Device PortraitUpsideDown")
        }else{
            print("Unknown Device Orientation")
        }

        stillImageOutput.captureStillImageAsynchronously(from: videoConnection, completionHandler: {(imageDataSampleBuffer, error) in
            let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)

            let image = UIImage(data: imageData!)
            self.imageProcessed = image!
            // This print always shows "3"
            print("Metadata Orientation: \(image!.imageOrientation.rawValue)")
            // The orientation looks correct here, but it is wrong later when the picture is displayed in an image view
            self.cameraView.backgroundColor = UIColor(patternImage: image!)

            self.captureSession.stopRunning()
        })
    }
}
}
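Note that `capturePicture` above computes `avCaptureOrientation` but never applies it to the capture connection, so the JPEG metadata never changes. A minimal sketch of applying it before capturing (assuming the same `stillImageOutput` setup as above):

```swift
import UIKit
import AVFoundation

// Sketch: apply the computed orientation to the capture connection
// before taking the still image, so the JPEG metadata matches the device.
func applyOrientation(_ orientation: AVCaptureVideoOrientation,
                      to connection: AVCaptureConnection) {
    if connection.isVideoOrientationSupported {
        connection.videoOrientation = orientation
    }
}

// Usage inside capturePicture, before captureStillImageAsynchronously:
// applyOrientation(avCaptureOrientation, to: videoConnection)
```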

Thank you very much!!!

Marina


OK, I have now changed my code a little. I can see the correct DeviceOrientation, and the imageOrientation also seems to be correct, as shown by the print statements.

But I still have the problem that when I put the picture into the imageView, its orientation is wrong (it is also scaled much too large).

The function in which I want to set the picture into the image view is:

@IBAction func showCirclesInPic(_ sender: Any) {
    if imageProcessed != nil {
        previewLayer!.removeFromSuperlayer()
        cameraView.addSubview(imageProcessedView)

        if (imageProcessed!.imageOrientation != .up){
            UIGraphicsBeginImageContextWithOptions(imageProcessed!.size, false, imageProcessed!.scale)
            imageProcessed!.draw(in: CGRect(x:0, y:0, width: imageProcessed!.size.width, height: imageProcessed!.size.height))
            imageProcessed = UIGraphicsGetImageFromCurrentImageContext()!
            UIGraphicsEndImageContext()
        }
        imageProcessedView.contentMode = .scaleAspectFit
        imageProcessedView.image = imageProcessed
        //print("ImageProcessed: \(ImageProcessingCV.showCircles(imageProcessed)!)")
    }
}
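The redraw step in `showCirclesInPic` can be factored into a small helper, so the normalization is reusable wherever the picture is displayed. A sketch (the extension method name `normalizedOrientation` is my own):

```swift
import UIKit

extension UIImage {
    // Redraws the image so its orientation metadata becomes .up,
    // baking the rotation into the pixel data.
    func normalizedOrientation() -> UIImage {
        guard imageOrientation != .up else { return self }
        UIGraphicsBeginImageContextWithOptions(size, false, scale)
        defer { UIGraphicsEndImageContext() }
        draw(in: CGRect(origin: .zero, size: size))
        return UIGraphicsGetImageFromCurrentImageContext() ?? self
    }
}

// Usage: imageProcessedView.image = imageProcessed?.normalizedOrientation()
```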

Here is the rest of my changed code:

import UIKit
import AVFoundation
import Photos

class ViewController: UIViewController {

@IBOutlet weak var cameraView: UIView!

@IBOutlet weak var imageProcessedView: UIImageView!
var imageProcessed: UIImage?
let captureSession = AVCaptureSession()
var captureDevice: AVCaptureDevice?
var previewLayer: AVCaptureVideoPreviewLayer?
var stillImageOutput: AVCaptureStillImageOutput = AVCaptureStillImageOutput()



override func viewDidLoad() {
    super.viewDidLoad()

    imageProcessedView.alpha = 1.0
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto
    backCamera()
}



func backCamera()
{
    let devices = AVCaptureDevice.devices()

    for device in devices! {
        if ((device as AnyObject).hasMediaType(AVMediaTypeVideo)){
            captureDevice = device as? AVCaptureDevice
            do {
                try captureSession.addInput(AVCaptureDeviceInput(device: captureDevice))
            } catch {
                print("error")
            }
            break
        }
    }
 }



@IBAction func takePhoto(_ sender: Any) {
    if captureDevice != nil
    {
        imageProcessedView.alpha = 0.0
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        self.cameraView.layer.addSublayer(previewLayer!)
        previewLayer?.frame = self.cameraView.layer.bounds
        previewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
        captureSession.startRunning()
        stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

        if captureSession.canAddOutput(stillImageOutput){
            captureSession.addOutput(stillImageOutput)
        }
    } else {
        print("No captureDevice")
    }
}

@IBAction func capturePicture(_ sender: Any) {
    if let videoConnection = stillImageOutput.connection(withMediaType: AVMediaTypeVideo){

        var currentDevice: UIDevice
        currentDevice = .current
        UIDevice.current.beginGeneratingDeviceOrientationNotifications()
        var deviceOrientation: UIDeviceOrientation
        deviceOrientation = currentDevice.orientation

        var imageOrientation: UIImageOrientation?

        if deviceOrientation == .portrait {
            imageOrientation = .up
            print("Device: Portrait")
        }else if (deviceOrientation == .landscapeLeft){
            imageOrientation = .left
            print("Device: LandscapeLeft")
        }else if (deviceOrientation == .landscapeRight){
            imageOrientation = .right
            print("Device LandscapeRight")
        }else if (deviceOrientation == .portraitUpsideDown){
            imageOrientation = .down
            print("Device PortraitUpsideDown")
        }else{
            print("Unknown Device Orientation")
        }
        stillImageOutput.captureStillImageAsynchronously(from: videoConnection, completionHandler: {(imageDataSampleBuffer, error) in
            let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
            let dataProvider  = CGDataProvider(data: imageData! as CFData)
            let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: CGColorRenderingIntent.defaultIntent )
            self.imageProcessed = UIImage(cgImage: cgImageRef!, scale: 1.0, orientation: imageOrientation!)
            //print("Image Orientation: \(self.imageProcessed?.imageOrientation.rawValue)")
            print("ImageO: \(imageOrientation!.rawValue)")

            self.cameraView.backgroundColor = UIColor(patternImage: self.imageProcessed!)

            self.captureSession.stopRunning()
        })
    }
}


@IBAction func focusAndExposeTap(_ gestureRecognizer: UITapGestureRecognizer) {
    let devicePoint = self.previewLayer?.captureDevicePointOfInterest(for: gestureRecognizer.location(in: gestureRecognizer.view))
    focus(with: .autoFocus, exposureMode: .autoExpose, at: devicePoint!, monitorSubjectAreaChange: true)
}

private func focus(with focusMode: AVCaptureFocusMode, exposureMode: AVCaptureExposureMode, at devicePoint: CGPoint, monitorSubjectAreaChange: Bool)
{
    if let device = captureDevice
    {
        do{
            try device.lockForConfiguration()
            if device.isFocusPointOfInterestSupported && device.isFocusModeSupported(focusMode)
            {
                device.focusPointOfInterest = devicePoint
                device.focusMode = focusMode
            }
            if device.isExposurePointOfInterestSupported && device.isExposureModeSupported(exposureMode)
            {
                device.exposurePointOfInterest = devicePoint
                device.exposureMode = exposureMode
            }
            device.isSubjectAreaChangeMonitoringEnabled = monitorSubjectAreaChange
            device.unlockForConfiguration()
        }catch{
            print("Could not lock device for configuration: \(error)")
        }
    }
}


@IBAction func showCirclesInPic(_ sender: Any) {
    if imageProcessed != nil {
        previewLayer!.removeFromSuperlayer()
        cameraView.addSubview(imageProcessedView)

        if (imageProcessed!.imageOrientation != .up){
            UIGraphicsBeginImageContextWithOptions(imageProcessed!.size, false, imageProcessed!.scale)
            imageProcessed!.draw(in: CGRect(x:0, y:0, width: imageProcessed!.size.width, height: imageProcessed!.size.height))
            imageProcessed = UIGraphicsGetImageFromCurrentImageContext()!
            UIGraphicsEndImageContext()
        }
        imageProcessedView.contentMode = .scaleAspectFit
        imageProcessedView.image = imageProcessed
        //print("ImageProcessed: \(ImageProcessingCV.showCircles(imageProcessed)!)")
    }
}


override func didReceiveMemoryWarning() {
    super.didReceiveMemoryWarning()
    // Dispose of any resources that can be recreated.
}
}

Best Answer

To change the orientation of the camera output when taking a picture, you should add a listener for orientation changes using NotificationCenter. I'll attach sample code for a camera using AVFoundation that you could use. Note that the capture and flip-camera buttons are added programmatically. Also, you can set an image for the flip-camera button (a comment tells you where to set it).
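The idea in a nutshell, before the full example (a sketch; the class and selector names are mine, and the full code below does the same thing inside the view controller):

```swift
import UIKit

class OrientationObserver: NSObject {
    func start() {
        // Ask UIKit to post notifications when the device rotates.
        UIDevice.current.beginGeneratingDeviceOrientationNotifications()
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(rotated),
            name: NSNotification.Name.UIDeviceOrientationDidChange,
            object: nil)
    }

    func rotated() {
        // Re-orient the AVCaptureVideoPreviewLayer connection here.
        // Note that UIDeviceOrientation and AVCaptureVideoOrientation are
        // mirrored for landscape (landscapeLeft device -> landscapeRight video).
    }
}
```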

//
//  CameraViewController.swift
//  MatchFriend
//
//  Created by Tarek Ezzat Abdallah on 10/1/16.
//  Copyright © 2016 Tarek. All rights reserved.
//

import UIKit
import AVFoundation

class CameraViewController: UIViewController{

    var captureSession: AVCaptureSession?
    var previewLayer : AVCaptureVideoPreviewLayer?
    var stillImageOutput: AVCaptureStillImageOutput?
    var imageToSend: UIImage!
    // If we find a device we'll store it here for later use
    var captureDevice : AVCaptureDevice?

    @IBOutlet weak var cameraView: UIView!
    var captureButton: UIButton!
    var flipCam:UIButton!
    var camPos:String = "back"
    func capture(_ sender: AnyObject)
    {
        if let videoConnection = stillImageOutput?.connection(withMediaType: AVMediaTypeVideo)
        {
            stillImageOutput?.captureStillImageAsynchronously(from: videoConnection)
            {
                (imageDataSampleBuffer, error) -> Void in
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
                let dataProvider  = CGDataProvider(data: imageData! as CFData)
                let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: CGColorRenderingIntent.defaultIntent )
                let image = UIImage(cgImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.right)
                // Store the captured image for later use
                self.imageToSend = image
            }
        }


    }
    override func viewDidLoad()
    {
        super.viewDidLoad()
        cameraView.backgroundColor = .clear
        self.view.backgroundColor = .clear

        captureButton = UIButton(frame: CGRect(x: 160, y: 580, width: 80, height: 80))
        captureButton.addTarget(self, action: #selector(CameraViewController.capture(_:)), for: UIControlEvents.touchUpInside)
        flipCam = UIButton(frame: CGRect(x: 0, y: 0, width: 40, height: 40))
        flipCam.addTarget(self, action: #selector(CameraViewController.flip), for: UIControlEvents.touchUpInside)
        self.view.addSubview(captureButton)
        self.view.addSubview(flipCam)
        captureButton.backgroundColor = UIColor.red
        NotificationCenter.default.addObserver(self, selector: #selector(CameraViewController.rotated), name: NSNotification.Name.UIDeviceOrientationDidChange, object: nil)



        //        let backCamera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
        camStart(camPos)
        setViews()

    }
    func camStart(_ position: String)
    {
        captureSession = AVCaptureSession()
        captureSession?.sessionPreset = AVCaptureSessionPresetHigh

        let videoDevices = AVCaptureDevice.devices(withMediaType: AVMediaTypeVideo)
        var capDevice:AVCaptureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)

        if(position == "back")
        {
            for device in videoDevices!
            {
                let device = device as! AVCaptureDevice
                if device.position == AVCaptureDevicePosition.back
                {
                    capDevice = device
                    break
                }
            }
        }
        else{

            for device in videoDevices!
            {
                let device = device as! AVCaptureDevice
                if device.position == AVCaptureDevicePosition.front
                {
                    capDevice = device
                    break
                }
            }
        }

        //        var error : NSError?
        let input = try? AVCaptureDeviceInput(device: capDevice)

        if let input = input, captureSession?.canAddInput(input) == true {

            captureSession?.addInput(input)

            stillImageOutput = AVCaptureStillImageOutput()
            stillImageOutput?.outputSettings = [AVVideoCodecKey : AVVideoCodecJPEG]

            if captureSession?.canAddOutput(stillImageOutput) == true {
                captureSession?.addOutput(stillImageOutput)
                previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                previewLayer?.videoGravity = AVLayerVideoGravityResizeAspect
                previewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.portrait
                previewLayer!.frame = self.view.layer.frame
                previewLayer!.videoGravity = AVLayerVideoGravityResizeAspectFill
                cameraView.layer.addSublayer(previewLayer!)


                captureSession?.startRunning()
            }
        }

    }
    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        captureButton.frame = CGRect(x: self.view.frame.width/2 - 40, y: self.view.frame.height - 70 - 40, width: 80, height: 80)
        flipCam.frame = CGRect(x: self.view.frame.width/6 - 40, y: self.view.frame.height - 60 - 40, width: 80, height: 80)
        //add the image to flip camera
        //flipCam.setImage(UIImage(named: "switchCam"), for: UIControlState.normal)
        flipCam.setTitle("flip", for: .normal)

        if previewLayer != nil{
            previewLayer!.frame = self.view.frame
        }

    }
    override func viewWillAppear(_ animated: Bool)
    {
        super.viewWillAppear(true)
        setViews()
    }

    override func didReceiveMemoryWarning()
    {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    func setViews()
    {
        captureButton.layer.cornerRadius = captureButton.layer.bounds.width/2
        captureButton.layer.borderWidth = 1

    }

    func configureDevice()
    {
        if let device = captureDevice {
            do {
                try device.lockForConfiguration()
                device.focusMode = .locked
                device.unlockForConfiguration()
            } catch {
                print("Could not lock device for configuration: \(error)")
            }
        }
    }
    func flip()
    {
        captureSession?.stopRunning()
        if camPos == "back"
        {
            camPos = "front"
        }
        else{
            camPos = "back"

        }
        camStart(camPos)

    }
    func rotated() {
        if UIDeviceOrientationIsLandscape(UIDevice.current.orientation) {
            if UIDevice.current.orientation == UIDeviceOrientation.landscapeLeft{
                previewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.landscapeRight
            }
            else
            {
                previewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.landscapeLeft

            }
        }

        if UIDeviceOrientationIsPortrait(UIDevice.current.orientation) {
            if UIDevice.current.orientation == UIDeviceOrientation.portrait{

                previewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.portrait
            }
            else
            {
                previewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.portraitUpsideDown

            }
        }

    }
}
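The `rotated()` handler above encodes the landscape left/right swap inline; it can also be written as one reusable mapping function. A sketch (the function name is mine; the swap exists because `UIDeviceOrientation` and `AVCaptureVideoOrientation` are mirrored for landscape):

```swift
import UIKit
import AVFoundation

// Sketch: map a device orientation to the matching video orientation.
// Returns nil for faceUp/faceDown/unknown, where no rotation makes sense.
func videoOrientation(for device: UIDeviceOrientation) -> AVCaptureVideoOrientation? {
    switch device {
    case .portrait:           return .portrait
    case .portraitUpsideDown: return .portraitUpsideDown
    case .landscapeLeft:      return .landscapeRight // mirrored
    case .landscapeRight:     return .landscapeLeft  // mirrored
    default:                  return nil
    }
}

// Usage in rotated():
// if let o = videoOrientation(for: UIDevice.current.orientation) {
//     previewLayer?.connection.videoOrientation = o
// }
```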

On ios - setting picture orientation with AVFoundation in Swift 3, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/44395488/
