iOS 10 - Objective-C : How to implement AVCapturePhotoOutput() to capture image and videos?

Tags: ios objective-c iphone ios10 avcapturesession

I am trying to capture images and videos from my app, and as of iOS 10, "AVCaptureStillImageOutput" is deprecated.

Please help me implement AVCapturePhotoOutput in Objective-C.

Here is my sample code:

_avCaptureOutput = [[AVCapturePhotoOutput alloc]init];
_avSettings = [AVCapturePhotoSettings photoSettings];


AVCaptureSession* captureSession = [[AVCaptureSession alloc] init];
[captureSession startRunning];



AVCaptureConnection *connection = [self.movieOutput connectionWithMediaType:AVMediaTypeVideo];

if (connection.active)
{
    //connection is active
    NSLog(@"Connection is active");

    id previewPixelType = _avSettings.availablePreviewPhotoPixelFormatTypes.firstObject;
    NSDictionary *format = @{(NSString *)kCVPixelBufferPixelFormatTypeKey : previewPixelType,
                             (NSString *)kCVPixelBufferWidthKey : @160,
                             (NSString *)kCVPixelBufferHeightKey : @160};

    _avSettings.previewPhotoFormat = format;

    [_avCaptureOutput capturePhotoWithSettings:_avSettings delegate:self];


}
else
{
    NSLog(@"Connection is not active");
    //connection is not active
    //try to change self.captureSession.sessionPreset,
    //or change videoDevice.activeFormat
}

Best answer

_avCaptureOutput = [[AVCapturePhotoOutput alloc] init];
_avSettings = [AVCapturePhotoSettings photoSettings];

AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
[captureSession startRunning];

[_avCaptureOutput capturePhotoWithSettings:_avSettings delegate:self];
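
Note that the snippet above shows only the capture call. For the photo output's connection to become active, the session also needs a camera input, and the photo output must be added to the session before -capturePhotoWithSettings:delegate: is called. A minimal setup sketch (the device lookup and session preset here are assumptions, not part of the original answer):

// Sketch only: build a session with a camera input and the photo output attached.
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
captureSession.sessionPreset = AVCaptureSessionPresetPhoto; // assumption: photo preset

// Assumption: use the default video (camera) device.
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *inputError = nil;
AVCaptureDeviceInput *cameraInput = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&inputError];
if (cameraInput && [captureSession canAddInput:cameraInput]) {
    [captureSession addInput:cameraInput];
}

_avCaptureOutput = [[AVCapturePhotoOutput alloc] init];
if ([captureSession canAddOutput:_avCaptureOutput]) {
    [captureSession addOutput:_avCaptureOutput];
}

[captureSession startRunning];
// Now -capturePhotoWithSettings:delegate: can be called as shown above.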

You have to implement AVCapturePhotoCaptureDelegate yourself:

#pragma mark - AVCapturePhotoCaptureDelegate
-(void)captureOutput:(AVCapturePhotoOutput *)captureOutput didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings error:(NSError *)error
{
    if (error) {
        NSLog(@"error : %@", error.localizedDescription);
    }

    if (photoSampleBuffer) {
        NSData *data = [AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer previewPhotoSampleBuffer:previewPhotoSampleBuffer];
        UIImage *image = [UIImage imageWithData:data];
    }
}

Now you have the image, and you can do whatever you want with it.
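
For example, one common next step is saving the captured UIImage to the photo library. A hedged sketch using the Photos framework (this assumes the Photos framework is linked and the app's Info.plist contains a photo-library usage description; it is not part of the original answer):

#import <Photos/Photos.h>

// Sketch only: persist the captured image (the UIImage from the delegate above)
// to the user's photo library.
[PHPhotoLibrary requestAuthorization:^(PHAuthorizationStatus status) {
    if (status != PHAuthorizationStatusAuthorized) {
        NSLog(@"Photo library access not granted");
        return;
    }
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        [PHAssetChangeRequest creationRequestForAssetFromImage:image];
    } completionHandler:^(BOOL success, NSError * _Nullable saveError) {
        NSLog(@"Saved photo: %d, error: %@", success, saveError);
    }];
}];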


Note: since iOS 11, -captureOutput:didFinishProcessingPhotoSampleBuffer:... is deprecated, and you need to use -captureOutput:didFinishProcessingPhoto:error: instead:

- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(nullable NSError *)error
{  
  NSData *imageData = [photo fileDataRepresentation];
  UIImage *image = [UIImage imageWithData:imageData];
  ...
}
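
With the newer API you can also ask for a specific codec when creating the settings. A small sketch (assumption: iOS 11 or later, where AVVideoCodecTypeJPEG is available; _avCaptureOutput is the same photo output created earlier):

// Sketch: explicitly request JPEG output before triggering the capture (iOS 11+).
AVCapturePhotoSettings *jpegSettings =
    [AVCapturePhotoSettings photoSettingsWithFormat:@{AVVideoCodecKey : AVVideoCodecTypeJPEG}];
[_avCaptureOutput capturePhotoWithSettings:jpegSettings delegate:self];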

Regarding iOS 10 - Objective-C : How to implement AVCapturePhotoOutput() to capture image and videos?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/39872528/
