ios - Captured image is stretched when using AVCaptureSession

Tags: ios iphone uiimage avfoundation avcapturesession

I am taking a photo using an AVCaptureSession.
The images are below.
Am I using the right approach, or is something wrong here?
I also changed the session preset, but with no success.

This is the preview before taking the photo:
[image: camera preview]

And the output comes out like this (stretched):
[image: stretched captured image]

Here is my code:

    // Camera input for the capture session.
    AVCaptureDeviceInput *input1 = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];

    // Video data output that delivers BGRA sample buffers to the delegate.
    AVCaptureVideoDataOutput *output1 = [[AVCaptureVideoDataOutput alloc] init];
    output1.alwaysDiscardsLateVideoFrames = YES;
    dispatch_queue_t queue = dispatch_queue_create("cameraQueue", NULL);
    [output1 setSampleBufferDelegate:self queue:queue];
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [output1 setVideoSettings:videoSettings];

    self.captureSession = [[AVCaptureSession alloc] init];
    [self.captureSession addInput:input1];
    [self.captureSession addOutput:output1];
    [self.captureSession setSessionPreset:AVCaptureSessionPresetiFrame960x540];

    self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

    // CHECK FOR YOUR APP
    self.previewLayer.frame = CGRectMake(self.cameraImageView.frame.origin.x,
                                         self.cameraImageView.frame.origin.y,
                                         self.img_view.frame.size.width,
                                         self.img_view.frame.size.height);

    [self.previewLayer.connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    [self.view_captureImage.layer insertSublayer:self.previewLayer atIndex:0];
    //[self.captureSession startRunning];


    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
    {
        NSLog(@"in captureOutput");
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Draw the BGRA pixel buffer into a bitmap context and grab a CGImage.
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef newImage = CGBitmapContextCreateImage(newContext);
        CGContextRelease(newContext);
        CGColorSpaceRelease(colorSpace);

        // Rotate to portrait; the camera delivers landscape-oriented buffers.
        self.cameraImage = [UIImage imageWithCGImage:newImage
                                               scale:1.0
                                         orientation:UIImageOrientationRight];
        CGImageRelease(newImage);
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    }
    - (IBAction)snapshot:(id)sender
    {
        NSLog(@"image snap");
        [self.captureSession stopRunning];
        [self.cameraImageView setImage:self.cameraImage];
        UIImage *img = self.cameraImageView.image;
        [self.img_view setImage:img];
        [self.view_captureImage setHidden:YES];
    }

Best answer

I think the problem is in your UIImageViews. Try setting the contentMode to UIViewContentModeScaleAspectFill:

    [imageView setContentMode:UIViewContentModeScaleAspectFill];
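
As a minimal sketch applied to the views from the question (assuming cameraImageView and img_view are the same outlets shown above), you would set this on both image views; enabling clipsToBounds as well keeps the aspect-filled image from drawing outside the view's bounds:

    // Assumption: these are the outlets used in the question's code.
    // UIViewContentModeScaleAspectFill preserves the image's aspect ratio
    // and crops the overflow; the default, UIViewContentModeScaleToFill,
    // stretches the image to the view's bounds, which causes the distortion.
    [self.cameraImageView setContentMode:UIViewContentModeScaleAspectFill];
    self.cameraImageView.clipsToBounds = YES;

    [self.img_view setContentMode:UIViewContentModeScaleAspectFill];
    self.img_view.clipsToBounds = YES;

The stretching is what you would expect with the default content mode: the iFrame preset produces 960x540 (16:9) frames, and scale-to-fill resizes them non-uniformly to whatever aspect ratio the image view happens to have.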

About "ios - Captured image is stretched when using AVCaptureSession": a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/32478832/
