iphone - UIImage from an AVCaptureVideoPreviewLayer

Tags: iphone ios uiimage calayer avcapturesession

I am trying to grab a still image from a video feed (basically a pause or "snapshot" feature). My project is set up using Benjamin Loulier's template. My problem is that even though I am displaying color video on screen via prevLayer (an AVCaptureVideoPreviewLayer), I have set the video settings to grayscale, so I cannot get a UIImage from customLayer (a regular CALayer).

I tried using the function given here, but for some silly reason it does not work with an AVCaptureVideoPreviewLayer (it renders clear/transparent). Does anyone know a way to save the contents of an AVCaptureVideoPreviewLayer as a UIImage?

Best Answer

OK, here is my answer, courtesy of https://developer.apple.com/library/ios/#qa/qa1702/_index.html

One note: minFrameDuration has been deprecated since iOS 5.0. I'm not sure of the reason, or whether there is a replacement.
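As a hedged aside (not part of the original answer): the frame-rate cap appears to have moved off the output in iOS 5, first to AVCaptureConnection's videoMinFrameDuration and, from iOS 7 onward, to AVCaptureDevice's activeVideoMinFrameDuration. A minimal sketch, assuming an already-configured output and device as in the code below; verify these API names against your deployment target:

    // Cap the frame rate at 15 fps without the deprecated output.minFrameDuration.
    // iOS 5-6: set it on the data output's video connection.
    AVCaptureConnection *connection = [output connectionWithMediaType:AVMediaTypeVideo];
    if (connection.supportsVideoMinFrameDuration)
        connection.videoMinFrameDuration = CMTimeMake(1, 15);

    // iOS 7+: set it on the capture device itself (lock it for configuration first).
    if ([device lockForConfiguration:NULL]) {
        device.activeVideoMinFrameDuration = CMTimeMake(1, 15);
        [device unlockForConfiguration];
    }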

#import <AVFoundation/AVFoundation.h>

// Create and configure a capture session and start it running
- (void)setupCaptureSession 
{
    NSError *error = nil;

    // Create the session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your 
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice
                             defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device 
                                                                    error:&error];
    if (!input) {
        // Handle the error appropriately.
    }
    [session addInput:input];

    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [session addOutput:output];

    // Configure your output.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Specify the pixel format
    output.videoSettings = 
                [NSDictionary dictionaryWithObject:
                    [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] 
                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];


    // If you wish to cap the frame rate to a known value, such as 15 fps, set 
    // minFrameDuration.
    output.minFrameDuration = CMTimeMake(1, 15);

    // Start the session running to start the flow of data
    [session startRunning];

    // Assign session to an ivar.
    [self setSession:session];
}

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput 
         didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
         fromConnection:(AVCaptureConnection *)connection
{ 
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    // Add your code here that uses the image

}

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer 
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0); 

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer); 

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, 
      bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context); 
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    // Free up the context and color space
    CGContextRelease(context); 
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}
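
For completeness, a hedged usage sketch (not from the original answer): the delegate above fires on the background queue "myQueue", so any UIKit work with the returned image — such as pausing the feed by dropping the frame into an image view — should hop to the main thread. imageView here is a hypothetical ivar standing in for whatever view displays the snapshot:

    // Inside captureOutput:didOutputSampleBuffer:fromConnection:
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    dispatch_async(dispatch_get_main_queue(), ^{
        // UIKit must only be touched on the main thread;
        // imageView is a hypothetical ivar for this sketch.
        self.imageView.image = image;
    });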

Regarding "iphone - UIImage from an AVCaptureVideoPreviewLayer", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/13981874/
