ios - How to take a picture using the proximity sensor?

Tags: ios swift image uiimageview proximitysensor

I can't get the device to take an image with the rear camera when the proximity sensor is activated. I don't want a camera preview to be shown; I just want the device to take a photo and present it in an imageView. I have the proximity sensor working, and I call imagePicker.takePicture() when the sensor is activated, but it doesn't seem to do anything. What method/function can I use to take a photo programmatically, without any user input?

Here is my code so far:

class ViewController: UIViewController, UINavigationControllerDelegate, UIImagePickerControllerDelegate {

    @IBOutlet var imageView: UIImageView!

    var imagePicker: UIImagePickerController!

    // *The function in question*
    func proximityChanged(notification: NSNotification) {
        let device = notification.object as? UIDevice
        if device?.proximityState == true {
            print("\(device) detected!")
            imagePicker.takePicture() // this call appears to do nothing
        }
    }
}

Best Answer

If you're having trouble taking a picture with UIImagePickerController, I suggest using AVFoundation instead.

Below is a working example. The photo capture is triggered by the proximity sensor.

You can add a preview if needed.
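For instance, a minimal preview sketch (Swift 2-era API, matching the answer's code; it assumes the captureSession configured below and would run in viewDidLoad after setup succeeds):

```swift
// Insert a preview layer behind the view's other content.
// Assumes `captureSession` is the session from setupCaptureSession().
let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
previewLayer.frame = view.bounds
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
view.layer.insertSublayer(previewLayer, atIndex: 0)
```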

import UIKit
import AVFoundation

final class CaptureViewController: UIViewController {

    @IBOutlet weak var imageView: UIImageView!

    private static let captureSessionPreset = AVCaptureSessionPresetPhoto
    private var captureSession: AVCaptureSession!
    private var photoOutput: AVCaptureStillImageOutput!
    private var initialized = false

    override func viewDidLoad() {
        super.viewDidLoad()
        initialized = setupCaptureSession()
    }

    override func viewWillAppear(animated: Bool) {
        super.viewWillAppear(animated)
        if initialized {
            captureSession.startRunning()
            UIDevice.currentDevice().proximityMonitoringEnabled = true
            NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(proximityStateDidChange), name: UIDeviceProximityStateDidChangeNotification, object: nil)
        }
    }

    override func viewDidDisappear(animated: Bool) {
        super.viewDidDisappear(animated)
        if initialized {
            NSNotificationCenter.defaultCenter().removeObserver(self, name: UIDeviceProximityStateDidChangeNotification, object: nil)
            UIDevice.currentDevice().proximityMonitoringEnabled = false
            captureSession.stopRunning()
        }
    }

    dynamic func proximityStateDidChange(notification: NSNotification) {
        if UIDevice.currentDevice().proximityState {
            captureImage()
        }
    }

    // MARK: - Capture Image

    private func captureImage() {
        if let c = findConnection() {
            photoOutput.captureStillImageAsynchronouslyFromConnection(c) { sampleBuffer, error in
                if let jpeg  = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer),
                   let image = UIImage(data: jpeg)
                {
                    dispatch_async(dispatch_get_main_queue()) { [weak self] in
                        self?.imageView.image = image
                    }
                }
            }
        }
    }

    // Find the output connection that carries video, i.e. the one to capture from.
    private func findConnection() -> AVCaptureConnection? {
        for case let connection as AVCaptureConnection in photoOutput.connections {
            for case let port as AVCaptureInputPort in connection.inputPorts
                where port.mediaType == AVMediaTypeVideo {
                return connection
            }
        }
        return nil
    }

    // MARK: - Setup Capture Session

    private func setupCaptureSession() -> Bool {
        captureSession = AVCaptureSession()
        if  captureSession.canSetSessionPreset(CaptureViewController.captureSessionPreset) {
            captureSession.sessionPreset = CaptureViewController.captureSessionPreset
            if setupCaptureSessionInput() && setupCaptureSessionOutput() {
                return true
            }
        }
        return false
    }

    private func setupCaptureSessionInput() -> Bool {
        if let captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo),
           let captureDeviceInput = try? AVCaptureDeviceInput.init(device: captureDevice)
        {
            if  captureSession.canAddInput(captureDeviceInput) {
                captureSession.addInput(captureDeviceInput)
                return true
            }
        }
        return false
    }

    private func setupCaptureSessionOutput() -> Bool {
        photoOutput = AVCaptureStillImageOutput()
        photoOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
        if  captureSession.canAddOutput(photoOutput) {
            captureSession.addOutput(photoOutput)
            return true
        }
        return false
    }

}
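Note that AVCaptureStillImageOutput was deprecated in iOS 10 in favor of AVCapturePhotoOutput. A hedged sketch of the equivalent capture path with the current API (modern Swift; the proximity-sensor trigger stays the same, and `ModernPhotoCapture` and `onImage` are names invented here for illustration):

```swift
import AVFoundation
import UIKit

final class ModernPhotoCapture: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()
    var onImage: ((UIImage) -> Void)?

    // Configure the session with the default video device and a photo output.
    func configure() throws {
        session.sessionPreset = .photo
        guard let device = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: device)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
    }

    // Call this from the proximity-sensor notification handler.
    func capture() {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    // Delegate callback: convert the captured photo to a UIImage on the main queue.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
              let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        DispatchQueue.main.async { self.onImage?(image) }
    }
}
```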

This answer to "ios - How to take a picture using the proximity sensor?" was found on Stack Overflow: https://stackoverflow.com/questions/34257044/
