I have an AVCaptureSession displaying live video in a UIView, and I want to save one frame of the video stream as a UIImage. I have been picking apart the code I keep seeing around the internet, but I'm stuck on the first line:
if let stillOutput = self.stillImageOutput {
    // Establish an AVCaptureConnection and capture a still image from it.
}
This gives me the error "'Camera' does not have a member named 'stillImageOutput'". The code depends on being able to get a video connection from that output.
I can post the full block of code if that helps. Thanks!
Best Answer
Once you have the stillImageOutput, you can capture an image with the following:
stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

if captureSession.canAddOutput(stillImageOutput) {
    captureSession.addOutput(stillImageOutput)
}

// I had to add a timer, otherwise the quality was messed up on iPad.
let timer = NSTimer.scheduledTimerWithTimeInterval(0.4, target: self, selector: Selector("getImage"), userInfo: nil, repeats: false)
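The asker's error ("'Camera' does not have a member named 'stillImageOutput'") usually means the property was never declared on the class. A minimal setup sketch in the same pre-iOS-10 Swift API as the answer, assuming a UIViewController that owns the session (the names captureSession, stillImageOutput, and previewLayer are illustrative, and error handling is omitted for brevity):

```swift
import UIKit
import AVFoundation

class CameraViewController: UIViewController {
    // These are the properties the answer's snippets refer to.
    let captureSession = AVCaptureSession()
    let stillImageOutput = AVCaptureStillImageOutput()
    var previewLayer: AVCaptureVideoPreviewLayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        captureSession.sessionPreset = AVCaptureSessionPresetPhoto

        // Attach the default video camera as input.
        let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        if let input = AVCaptureDeviceInput(device: device, error: nil) {
            if captureSession.canAddInput(input) {
                captureSession.addInput(input)
            }
        }

        // Show the live preview in this view controller's view.
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer?.frame = view.bounds
        view.layer.addSublayer(previewLayer!)

        captureSession.startRunning()
    }
}
```

With a stillImageOutput property declared like this, the `if let stillOutput = self.stillImageOutput` line from the question compiles (or can simply be dropped, since the property is non-optional here).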
Then my function to get the image:
func getImage() {
    if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
        stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
            (imageDataSampleBuffer, error) -> Void in
            let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
            // Use your image or store it to the photo album:
            // UIImageWriteToSavedPhotosAlbum(UIImage(data: imageData), nil, nil, nil)
            self.stopSession()
        }
    }
}
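Since jpegStillImageNSDataRepresentation returns JPEG-encoded NSData, the UIImage the question asks for can be produced inside the same completion handler; a sketch, where capturedImage is a hypothetical property for holding the result:

```swift
let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
if let image = UIImage(data: imageData) {
    // The captured frame as a UIImage, ready to display or save.
    self.capturedImage = image  // hypothetical property, not part of the answer above
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
}
```

Using the failable `UIImage(data:)` initializer avoids force-unwrapping if the JPEG data turns out to be invalid.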
And to stop the session and remove the preview layer:
func stopSession() {
    self.captureSession.stopRunning()
    self.previewLayer?.removeFromSuperlayer()
}
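Instead of the fixed 0.4-second NSTimer workaround, the capture could also be triggered from a button once the session is actually delivering frames; a hedged sketch (the action name is illustrative):

```swift
@IBAction func captureButtonTapped(sender: AnyObject) {
    // Only capture once the session is running, so the first frames
    // (which can have poor exposure) are skipped naturally.
    if captureSession.running {
        getImage()
    }
}
```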
About ios - Capture a still image from AVCaptureSession in Swift, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/26529787/