ios - iOS memory grows when creating a UIImage from a CMSampleBufferRef

Tags: ios memory-leaks uiimage cmsamplebufferref

I am creating UIImage objects from CMSampleBufferRefs. I do this on a separate (background) queue, so I wrap the processing in an @autoreleasepool. The problem is that memory keeps building up, with no leaks reported. Below is the method I use:

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    @autoreleasepool {
        // Get the CMSampleBuffer's Core Video image buffer for the media data
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        // Get the base address of the pixel buffer
        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

        // Get the number of bytes per row for the pixel buffer
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        // Get the pixel buffer width and height
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Create a device-dependent RGB color space
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

        // Create a bitmap graphics context with the sample buffer data
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                     bytesPerRow, colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        // Create a Quartz image from the pixel data in the bitmap graphics context
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

        // Free up the context and color space
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        // Create an image object from the Quartz image; the extra retain (MRC)
        // keeps the image alive past this pool, so the caller must release it
        UIImage *image = [[UIImage imageWithCGImage:quartzImage] retain];

        // Release the Quartz image
        CGImageRelease(quartzImage);

        return image;
    }
}
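For reference, here is a minimal sketch of the same conversion written for a file compiled with ARC (an assumption, since the code above is clearly manual-reference-counting). Under ARC the explicit retain on the UIImage disappears, while the CGImageRef still follows the Core Foundation Create rule and must be released by hand:

```objectivec
// Sketch only: same conversion, assuming ARC is enabled for this file.
// The UIImage is managed by ARC; the CGImageRef is not, so it is still
// released explicitly after the UIImage has been created from it.
- (UIImage *)arcImageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // ARC takes ownership of the UIImage; no retain/release needed
    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);
    return image;
}
```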

And this is how I use it:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Retain the sample buffer so it survives until the async block runs
    CFRetain(sampleBuffer);
    dispatch_async(movieWritingQueue, ^{
        @autoreleasepool {
            if (self.returnCapturedImages && captureOutput != audioOutput) {
                UIImage *capturedImage = [self imageFromSampleBuffer:sampleBuffer];

                dispatch_async(callbackQueue, ^{
                    @autoreleasepool {
                        if (self.delegate && [self.delegate respondsToSelector:@selector(recorderCapturedImage:)]) {
                            [self.delegate recorderCapturedImage:capturedImage];
                        }
                        // Balance the retain inside imageFromSampleBuffer:
                        [capturedImage release];
                    }
                });
            }
            CFRelease(sampleBuffer);
        }
    });
}

Best Answer

I found a temporary workaround: I perform the same conversion, but on the main queue. It is not elegant or efficient, but at least memory no longer accumulates.

I wonder whether this is an iOS bug...?

UPDATE:
This is how I process the CMSampleBuffers on the main thread:

[[NSOperationQueue mainQueue] addOperationWithBlock:^{

    CGImageRef cgImage = [self cgImageFromSampleBuffer:sampleBuffer];
    UIImage *capturedImage = [UIImage imageWithCGImage:cgImage];

    // Do something with the image - I suggest on a background thread
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // do something with the image
    });

    CGImageRelease(cgImage);
    CFRelease(sampleBuffer);
}];
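Note that the block above ends with CFRelease(sampleBuffer), so for the retain counts to balance, the caller must retain the sample buffer before enqueuing the block, exactly as the original didOutputSampleBuffer: code does with CFRetain. A sketch of the caller side (an assumption based on the CFRelease shown, not code from the answer):

```objectivec
// Sketch: retain the sample buffer before the async block is enqueued,
// because AVFoundation may recycle it once the delegate callback returns.
// The CFRelease inside the block balances this CFRetain.
CFRetain(sampleBuffer);
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
    CGImageRef cgImage = [self cgImageFromSampleBuffer:sampleBuffer];
    // ... use cgImage ...
    CGImageRelease(cgImage);
    CFRelease(sampleBuffer);   // balances the CFRetain above
}];
```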

- (CGImageRef) cgImageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer,0);        // Lock the image buffer

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);   // Get information of the image
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    CGContextRelease(newContext);

    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);
    /* CVBufferRelease(imageBuffer); */  // do not call this - the image buffer is owned by the sample buffer

    return newImage;
}

This answer to "iOS memory grows when creating a UIImage from a CMSampleBufferRef" comes from a similar question on Stack Overflow: https://stackoverflow.com/questions/28218219/
