ios - Uncompressed UIImage from AVCaptureStillImageOutput

Tags: ios uiimage avfoundation ios-camera core-video

Here is the camera configuration I have tried so far:

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [session setSessionPreset:AVCaptureSessionPresetInputPriority];

    AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];

    NSError *errorVideo;

    AVCaptureDeviceFormat *deviceFormat = nil;
    for (AVCaptureDeviceFormat *format in videoDevice.formats) {
        CMVideoDimensions dim = CMVideoFormatDescriptionGetDimensions(format.formatDescription);

        if (dim.width == 2592 && dim.height == 1936) {
            deviceFormat = format;
            break;
        }
    }

    // Lock the device before changing its active format / exposure / focus;
    // lockForConfiguration: can fail, so check its return value
    if ([videoDevice lockForConfiguration:&errorVideo]) {
        if (deviceFormat) {
            videoDevice.activeFormat = deviceFormat;

            if ([videoDevice isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
                [videoDevice setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
            }

            if ([videoDevice isAutoFocusRangeRestrictionSupported]) {
                [videoDevice setAutoFocusRangeRestriction:AVCaptureAutoFocusRangeRestrictionFar];
            }
        }
        [videoDevice unlockForConfiguration];
    }

    AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&errorVideo];

    if ([session canAddInput:videoDeviceInput]) {
        [session addInput:videoDeviceInput];
    }

    AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];

    if ([session canAddOutput:stillImageOutput]) {
        [stillImageOutput setOutputSettings:@{(id)kCVPixelBufferPixelFormatTypeKey:@(kCVPixelFormatType_32BGRA)}];
        [session addOutput:stillImageOutput];
    }
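
(Note: the capture call below uses a `connection` variable that the snippet above never creates. Presumably it is obtained from the still image output once the session is running; a minimal sketch of that wiring, which is an assumption and not part of the original code:)

    [session startRunning];

    // Assumption: the `connection` used in the capture call below comes from
    // the still image output's video connection.
    AVCaptureConnection *connection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];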

And here is how I try to get a UIImage from the CMSampleBuffer:

    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

        if (imageDataSampleBuffer && !error) {
            dispatch_async(dispatch_get_main_queue(), ^{
                UIImage *image = [self imageFromSampleBuffer:imageDataSampleBuffer];
            });
        }
    }];

This is the Apple sample code:

    - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

        // Get the number of bytes per row for the pixel buffer
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        // Get the pixel buffer width and height
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Create a device-dependent RGB color space
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

        // Create a bitmap graphics context with the sample buffer data.
        // Note: this layout (little-endian, premultiplied alpha first) assumes
        // the buffer is 32BGRA, matching the output settings requested above.
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                     bytesPerRow, colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        // Create a Quartz image from the pixel data in the bitmap graphics context
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

        // Free up the context and color space
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        // Create an image object from the Quartz image
        UIImage *image = [UIImage imageWithCGImage:quartzImage];

        // Release the Quartz image
        CGImageRelease(quartzImage);

        return image;
    }

But the image is always nil. After some debugging I found that CMSampleBufferGetImageBuffer(sampleBuffer) always returns NULL.

Can anyone help?

Best Answer

This is because the CMSampleBufferRef has to be processed immediately; it is deallocated again very quickly and efficiently.
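
(An aside, not part of the original answer: a CMSampleBufferRef is a Core Foundation object, so if it really must outlive the completion handler it can be kept alive with a manual retain/release pair. A sketch in the question's Objective-C, where `processingQueue` is a hypothetical serial queue:)

    CFRetain(imageDataSampleBuffer);    // keep the buffer alive past the handler
    dispatch_async(processingQueue, ^{  // `processingQueue` is hypothetical
        UIImage *image = [self imageFromSampleBuffer:imageDataSampleBuffer];
        CFRelease(imageDataSampleBuffer);   // balance the retain once conversion is done
        // ... hand `image` off for further processing ...
    });

Converting immediately inside the handler, as the code below does, avoids holding on to capture-pipeline buffers at all.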

Here is the code I use to generate the image:

    let connection = imageFileOutput.connectionWithMediaType(AVMediaTypeVideo)

    if connection != nil {
        imageFileOutput.captureStillImageAsynchronouslyFromConnection(connection) { [weak self] (buffer, err) -> Void in
            if CMSampleBufferIsValid(buffer) {
                let imageDataJpeg = self?.imageFromSampleBuffer(buffer)
            } else {
                print(err)
            }
        }
    }

As you can see, I convert the buffer to an image while still inside the scope of that completion handler. Once it is an image, I send it off for processing.
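
Applied to the Objective-C code from the question, the same fix looks roughly like this (a sketch; `handleCapturedImage:` is a hypothetical method standing in for whatever processing follows):

    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer && !error) {
            // Convert on the handler's queue, while the buffer is still valid --
            // do NOT defer the conversion to another queue.
            UIImage *image = [self imageFromSampleBuffer:imageDataSampleBuffer];
            dispatch_async(dispatch_get_main_queue(), ^{
                // Only the finished UIImage crosses queues.
                [self handleCapturedImage:image];   // hypothetical
            });
        }
    }];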

This post on ios - Uncompressed UIImage from AVCaptureStillImageOutput is based on a similar question on Stack Overflow: https://stackoverflow.com/questions/36935193/
