objective-c - AVAssetWriterInputPixelBufferAdaptor and CMTime

Tags: objective-c, ios, avfoundation, avassetwriter, cmtime

I am using an AVAssetWriterInputPixelBufferAdaptor to write some frames to a video, and the behavior with respect to time is not what I expect.

If I write just one frame:

 [videoWriter startSessionAtSourceTime:kCMTimeZero];
 [adaptor appendPixelBuffer:pxBuffer withPresentationTime:kCMTimeZero];

this gets me a zero-length video, which is what I expect.

But if I go on to add a second frame:

 // 3000/600 = 5 sec, right?
 CMTime nextFrame = CMTimeMake(3000, 600); 
 [adaptor appendPixelBuffer:pxBuffer withPresentationTime:nextFrame];

I get a 10-second video, where I expected 5 seconds.

What is going on here? Does withPresentationTime somehow set both the start and the duration of the frame?

Note that I am not calling endSessionAtSourceTime:, just finishWriting.
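
For reference, a minimal sketch of the teardown that would pin the end time explicitly instead (writerInput here stands for the AVAssetWriterInput, which is not shown above):

 [adaptor appendPixelBuffer:pxBuffer withPresentationTime:CMTimeMake(3000, 600)];
 [videoWriter endSessionAtSourceTime:CMTimeMake(3000, 600)]; // end the movie exactly at 5 sec
 [writerInput markAsFinished];
 [videoWriter finishWriting];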

Best Answer

Try looking at this sample and reverse-engineering it to add one frame 5 seconds in...

Here is the sample code link: git@github.com:RudyAramayo/AVAssetWriterInputPixelBufferAdaptorSample.git

Here is the code you need:

- (void)testCompressionSession
{
    CGSize size = CGSizeMake(480, 320);

    NSString *betaCompressionDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];

    NSError *error = nil;

    // Remove any previous output file at this path.
    unlink([betaCompressionDirectory UTF8String]);

    //----initialize compression engine
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:betaCompressionDirectory]
                                                           fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];
    NSParameterAssert(videoWriter);
    if (error)
        NSLog(@"error = %@", [error localizedDescription]);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:size.height], AVVideoHeightKey, nil];
    AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                           [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                                                                     sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
    NSParameterAssert(writerInput);
    NSParameterAssert([videoWriter canAddInput:writerInput]);

    if ([videoWriter canAddInput:writerInput])
        NSLog(@"I can add this input");
    else
        NSLog(@"I can't add this input");

    [videoWriter addInput:writerInput];

    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    //---
    // insert demo debugging code to write the same image repeated as a movie

    CGImageRef theImage = [[UIImage imageNamed:@"Lotus.png"] CGImage];

    dispatch_queue_t dispatchQueue = dispatch_queue_create("mediaInputQueue", NULL);
    int __block frame = 0;

    [writerInput requestMediaDataWhenReadyOnQueue:dispatchQueue usingBlock:^{
        while ([writerInput isReadyForMoreMediaData])
        {
            // Stop after 119 frames; at a timescale of 20 that is just under 6 seconds.
            if (++frame >= 120)
            {
                [writerInput markAsFinished];
                [videoWriter finishWriting]; // pre-ARC sample; newer SDKs prefer finishWritingWithCompletionHandler:
                [videoWriter release];
                break;
            }

            CVPixelBufferRef buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:theImage size:size];
            if (buffer)
            {
                // Each frame is stamped at frame/20 seconds; the gap between successive
                // presentation times is what determines each frame's on-screen duration.
                if (![adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(frame, 20)])
                    NSLog(@"FAIL");
                else
                    NSLog(@"Success:%d", frame);
                CFRelease(buffer);
            }
        }
    }];

    NSLog(@"outside for loop");
}


- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef)options, &pxbuffer);
    // CVReturn status = CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pxbuffer);

    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    // Use the buffer's actual bytes-per-row rather than 4 * width,
    // since Core Video may pad each row.
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace, kCGImageAlphaPremultipliedFirst);
    NSParameterAssert(context);

    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);

    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    // Caller is responsible for releasing the returned buffer with CFRelease.
    return pxbuffer;
}
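
Following the suggestion of reverse-engineering the sample to "add one frame 5 seconds in", the request block above could be reduced to something like this sketch (untested; same videoWriter/writerInput/adaptor as in the sample, with the session ended explicitly so the last frame's duration is not inferred from the previous frame gap):

    CVPixelBufferRef buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:theImage size:size];
    [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];            // frame at t = 0
    [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(3000, 600)];  // frame at t = 5 sec
    CFRelease(buffer);
    [videoWriter endSessionAtSourceTime:CMTimeMake(3000, 600)]; // clamp the movie to 5 sec
    [writerInput markAsFinished];
    [videoWriter finishWriting];

With the session ended at 5 seconds, the second frame gets zero duration; move the end time later if that frame should stay on screen.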

Regarding objective-c - AVAssetWriterInputPixelBufferAdaptor and CMTime, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/5808557/
