iOS - CVPixelBufferCreate memory not released correctly when making a video from images

Tags: ios objective-c xcode memory

I am making a video out of images, but it always crashes with a memory warning: CVPixelBufferCreate allocates too much. I don't know how to handle it correctly. I have looked at many similar topics, but none of them solved my problem.

Here is my code:

- (void) writeImagesArray:(NSArray*)array asMovie:(NSString*)path
{
    NSError *error  = nil;
    UIImage *first = [array objectAtIndex:0];
    CGSize frameSize = first.size;
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                                  [NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];
    NSParameterAssert(videoWriter);
    
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithDouble:frameSize.width],AVVideoWidthKey,
                                   [NSNumber numberWithDouble:frameSize.height], AVVideoHeightKey,
                                   nil];
    
    AVAssetWriterInput* writerInput = [AVAssetWriterInput
                                       assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];
    
    self.adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                     sourcePixelBufferAttributes:nil];
    
    [videoWriter addInput:writerInput];
    
    //Start Session
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];
    
    int frameCount = 0;
    CVPixelBufferRef buffer = NULL;
    for(UIImage *img in array)
    {
        buffer = [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize];
        if (self.adaptor.assetWriterInput.readyForMoreMediaData)
        {
            CMTime frameTime =  CMTimeMake(frameCount,FPS);
            [self.adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
        }
        if(buffer)
            CVPixelBufferRelease(buffer);
        
        frameCount++;
    }
    
    [writerInput markAsFinished];
    [videoWriter finishWritingWithCompletionHandler:^{
        
        if (videoWriter.status == AVAssetWriterStatusFailed) {
            
            NSLog(@"Movie save failed.");
            
        }else{
            
            NSLog(@"Movie saved.");
        }
    }];
    
    NSLog(@"Finished.");
}

        
- (CVPixelBufferRef)newPixelBufferFromCGImage: (CGImageRef) image andFrameSize:(CGSize)frameSize
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    
    CVPixelBufferRef pxbuffer = NULL;
    
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          frameSize.width,
                                          frameSize.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options,
                                          &pxbuffer);
    
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
    
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);
    
    CGBitmapInfo bitmapInfo = (CGBitmapInfo) kCGImageAlphaNoneSkipFirst;
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata,
                                                 frameSize.width,
                                                 frameSize.height,
                                                 8,
                                                 4*frameSize.width,
                                                 rgbColorSpace,
                                                 bitmapInfo);
    
    NSParameterAssert(context);
    CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}

Update:

I split the video into small segments, and after adding [NSThread sleepForTimeInterval:0.00005]; inside the loop, the memory is magically released.

However, that line makes my UI freeze for a few seconds. Is there a better solution?

for(UIImage *img in array)
{
    buffer = [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize];
    //CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, adaptor.pixelBufferPool, &buffer);
    if (adaptor.assetWriterInput.readyForMoreMediaData)
    {
        CMTime frameTime =  CMTimeMake(frameCount,FPS);
        [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
    }
    
    if(buffer)
        CVPixelBufferRelease(buffer);
    
    frameCount++;
    
    [NSThread sleepForTimeInterval:0.00005];
}

Memory usage after the change (screenshot omitted).

Best Answer

From a quick look at your code, I can't see anything wrong in the management of the CVPixelBuffer itself.
What I think could be the source of your problem is the array of UIImages.
UIImage defers decoding: until you request the CGImage property or draw the image, the underlying bitmap is not decoded into memory, so unused images have little memory impact.
Your enumeration calls the CGImage property on every image, and you never get rid of the decoded data, which would explain the steady growth in memory allocations.
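If that is the case, a common mitigation (a sketch, not part of the original answer) is to stop holding decoded UIImages up front: load each image inside the loop and wrap every iteration in an @autoreleasepool, so the decoded bitmap data can be freed as soon as its frame has been appended. Here imagePaths is a hypothetical array of file paths standing in for the poster's image array; frameSize and FPS are as in the question:

int frameCount = 0;
for (NSString *imagePath in imagePaths)
{
    @autoreleasepool {
        // The decoded bitmap lives only for this iteration of the loop.
        UIImage *img = [UIImage imageWithContentsOfFile:imagePath];
        CVPixelBufferRef buffer = [self newPixelBufferFromCGImage:[img CGImage]
                                                     andFrameSize:frameSize];
        if (buffer) {
            if (self.adaptor.assetWriterInput.readyForMoreMediaData) {
                CMTime frameTime = CMTimeMake(frameCount, FPS);
                [self.adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
            }
            CVPixelBufferRelease(buffer);
        }
        frameCount++;
    }
}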

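As for the UI freeze mentioned in the update: sleeping on the main thread only papers over the problem. AVFoundation's pull model, requestMediaDataWhenReadyOnQueue:usingBlock:, lets the writer input ask for frames on a background queue, which keeps the main thread free and gives the autorelease pool a chance to drain between batches. A sketch under the same assumptions as above:

dispatch_queue_t writeQueue = dispatch_queue_create("videowriter", DISPATCH_QUEUE_SERIAL);
__block int frameCount = 0;
[writerInput requestMediaDataWhenReadyOnQueue:writeQueue usingBlock:^{
    // The block is invoked whenever the input can accept more data;
    // drain its readiness, then finish once every frame has been appended.
    while (writerInput.readyForMoreMediaData) {
        if (frameCount >= (int)imagePaths.count) {
            [writerInput markAsFinished];
            [videoWriter finishWritingWithCompletionHandler:^{
                NSLog(@"Movie saved.");
            }];
            break;
        }
        @autoreleasepool {
            UIImage *img = [UIImage imageWithContentsOfFile:imagePaths[frameCount]];
            CVPixelBufferRef buffer = [self newPixelBufferFromCGImage:[img CGImage]
                                                         andFrameSize:frameSize];
            if (buffer) {
                [self.adaptor appendPixelBuffer:buffer
                           withPresentationTime:CMTimeMake(frameCount, FPS)];
                CVPixelBufferRelease(buffer);
            }
        }
        frameCount++;
    }
}];

Finally, note that sourcePixelBufferAttributes was passed as nil, so the adaptor never creates a pixelBufferPool; supplying pixel buffer attributes there and creating buffers with CVPixelBufferPoolCreatePixelBuffer (as the commented-out line in the update hints) would recycle buffers instead of calling CVPixelBufferCreate for every frame.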
Regarding iOS - CVPixelBufferCreate memory not released correctly when making a video from images, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/28409041/
