I tried the sample from Apple's Technical Q&A QA1702, which uses AV Foundation to capture video frames from the camera as images:
http://developer.apple.com/library/ios/#qa/qa1702/_index.html
However, the delegate method it gives for obtaining a UIImage object from the
CMSampleBufferRef does not build.
That is, even though I have imported the AVFoundation framework, I get 14 linker errors such as _CVPixelBufferUnlockBaseAddress
, referenced from: .
Does anyone know how to fix this? Please help.
Thanks in advance. If anyone knows, please let me know.
Best answer
Here is code that captures data and gets an image from the captured data:
-(IBAction)startCapture
{
    // Create the capture session
    captureSession = [[AVCaptureSession alloc] init];
    captureSession.sessionPreset = AVCaptureSessionPresetMedium;

    // Show a live camera preview
    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
    previewLayer.frame = CGRectMake(0, 10, 320, 200); // or self.view.frame
    [self.view.layer addSublayer:previewLayer];

    // Use the default video capture device (the camera)
    NSError *error = nil;
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Attach the camera to the session as an input
    AVCaptureDeviceInput *inputDevice = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
    [captureSession addInput:inputDevice];

    // Add a still-image output that produces JPEG data
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                       AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [captureSession addOutput:stillImageOutput];

    [captureSession startRunning];
}
-(IBAction)captureNow
{
    // Find the video connection on the still-image output
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                  completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        NSLog(@"exif attachments: %@", exifAttachments);
        if (exifAttachments)
        {
            // Convert the JPEG sample buffer into a UIImage
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
            self.vImage.image = image;
            // Do something with the attachments.
        }
        else
        {
            NSLog(@"no attachments");
        }
    }];
}
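As for the linker errors mentioned in the question: an unresolved symbol like `_CVPixelBufferUnlockBaseAddress` usually means the project links only AVFoundation, while QA1702's `imageFromSampleBuffer:` also calls into CoreVideo and CoreMedia. A sketch of the fix (the build-phase name below is for Xcode's standard target settings):

```objc
// QA1702's sample-buffer-to-UIImage code uses symbols from three frameworks.
// Besides AVFoundation, add CoreVideo.framework and CoreMedia.framework to
// the target's "Link Binary With Libraries" build phase, and import:
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>
```

`CVPixelBufferLockBaseAddress`/`CVPixelBufferUnlockBaseAddress` are declared in CoreVideo and `CMSampleBufferGetImageBuffer` in CoreMedia, so the unresolved symbols should disappear once both frameworks are linked.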
Regarding iphone - Converting from CMSampleBuffer to a UIImage object, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/5645055/