ios - Error Domain=AVFoundationErrorDomain Code=-11821 "Cannot Decode"

Tags: ios objective-c iphone avfoundation

While trying to merge videos with AVFoundation, I ran into some strange behavior. I'm pretty sure I've made a mistake somewhere, but I'm too blind to see it. My goal is simply to merge 4 videos (later there will be cross-fade transitions between them). Every time I try to export the video, I get this error:

Error Domain=AVFoundationErrorDomain Code=-11821 "Cannot Decode" UserInfo=0x7fd94073cc30 {NSLocalizedDescription=Cannot Decode, NSLocalizedFailureReason=The media data could not be decoded. It may be damaged.}

The most interesting part is that if I don't give the AVAssetExportSession an AVMutableVideoComposition, everything works fine! I can't understand what I'm doing wrong. The source videos were downloaded from YouTube and have the .mp4 extension; I can play them with MPMoviePlayerController. While reviewing the source code, please look closely at the AVMutableVideoComposition. I tested this code in Xcode 6.0.1 on the iOS Simulator.

#import "VideoStitcher.h"
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>

@implementation VideoStitcher
{
    VideoStitcherCompletionBlock _completionBlock;
    AVMutableComposition *_composition;
    AVMutableVideoComposition *_videoComposition;
}

- (instancetype)init
{
    self = [super init];
    if (self)
    {
        _composition = [AVMutableComposition composition];
        _videoComposition = [AVMutableVideoComposition videoComposition];
    }
    return self;
}

- (void)compileVideoWithAssets:(NSArray *)assets completion:(VideoStitcherCompletionBlock)completion
{
    _completionBlock = [completion copy];

    if (assets == nil || assets.count < 2)
    {
        // We need at least two videos to make a stitch, right?
        NSAssert(NO, @"VideoStitcher: assets parameter is nil or has not enough items in it");
    }
    else
    {
        [self composeAssets:assets];
        if (_composition != nil) // if stitching went good and no errors were found
            [self exportComposition];
    }
}

- (void)composeAssets:(NSArray *)assets
{
    AVMutableCompositionTrack *compositionVideoTrack = [_composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                     preferredTrackID:kCMPersistentTrackID_Invalid];

    NSError *compositionError = nil;
    CMTime currentTime = kCMTimeZero;
    AVAsset *asset = nil;
    for (int i = (int)assets.count - 1; i >= 0; i--) //For some reason videos are compiled in reverse order. Find the bug later. 06.10.14
    {
        asset = assets[i];
        AVAssetTrack *assetVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        BOOL success = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetVideoTrack.timeRange.duration)
                                                      ofTrack:assetVideoTrack
                                                       atTime:currentTime
                                                        error:&compositionError];
        if (success)
        {
            CMTimeAdd(currentTime, asset.duration);
        }
        else
        {
            NSLog(@"VideoStitcher: something went wrong during inserting time range in composition");
            if (compositionError != nil)
            {
                NSLog(@"%@", compositionError);
                _completionBlock(nil, compositionError);
                _composition = nil;
                return;
            }
        }
    }

    AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, _composition.duration);
    videoCompositionInstruction.backgroundColor = [[UIColor redColor] CGColor];
    _videoComposition.instructions = @[videoCompositionInstruction];
    _videoComposition.renderSize = [self calculateOptimalRenderSizeFromAssets:assets];
    _videoComposition.frameDuration = CMTimeMake(1, 600);
}

- (void)exportComposition
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs =  [documentsDirectory stringByAppendingPathComponent:@"testVideo.mov"];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];


    NSString *filePath = [url path];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:filePath]) {
        NSError *error;
        if ([fileManager removeItemAtPath:filePath error:&error] == NO) {
            NSLog(@"removeItemAtPath %@ error:%@", filePath, error);
        }
    }

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:_composition
                                                                      presetName:AVAssetExportPreset1280x720];
    exporter.outputURL = url;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = _videoComposition;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        [self exportDidFinish:exporter];
    }];
}

- (void)exportDidFinish:(AVAssetExportSession*)session
{
    NSLog(@"%li", session.status);
    if (session.status == AVAssetExportSessionStatusCompleted)
    {
        NSURL *outputURL = session.outputURL;

        // time to call delegate methods, but for testing purposes we save the video in 'photos' app

        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL])
        {
            [library writeVideoAtPathToSavedPhotosAlbum:outputURL completionBlock:^(NSURL *assetURL, NSError *error){
                if (error == nil)
                {
                    NSLog(@"successfully saved video");
                }
                else
                {
                    NSLog(@"saving video failed.\n%@", error);
                }
            }];
        }
    }
    else if (session.status == AVAssetExportSessionStatusFailed)
    {
        NSLog(@"VideoStitcher: exporting failed.\n%@", session.error);
    }
}

- (CGSize)calculateOptimalRenderSizeFromAssets:(NSArray *)assets
{
    AVAsset *firstAsset = assets[0];
    AVAssetTrack *firstAssetVideoTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    CGFloat maxWidth = firstAssetVideoTrack.naturalSize.height;
    CGFloat maxHeight = firstAssetVideoTrack.naturalSize.width;

    for (AVAsset *asset in assets)
    {
        AVAssetTrack *assetVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        if (assetVideoTrack.naturalSize.width > maxWidth)
            maxWidth = assetVideoTrack.naturalSize.width;
        if (assetVideoTrack.naturalSize.height > maxHeight)
            maxHeight = assetVideoTrack.naturalSize.height;
    }

    return CGSizeMake(maxWidth, maxHeight);
}

@end

Thanks for your attention. I'm really tired; I've spent four hours straight trying to find this bug. I'm going to sleep.

Best Answer

I finally found the solution. The error description led me in the wrong direction: "Cannot Decode. The media data could not be decoded. It may be damaged." Based on that description, you'd think something was wrong with your video files. I spent 5 hours experimenting with formats, debugging, and so on.

Well, the answer turned out to be something completely different!

My mistake was that I forgot about the return value of CMTimeAdd(). I assumed it modified the value of its first argument, as you can see in the code:

CMTime currentTime = kCMTimeZero;
for (int i = (int)assets.count - 1; i >= 0; i--)
{
    CMTimeAdd(currentTime, asset.duration); //HERE!! I don't actually increment the value! currentTime is always kCMTimeZero
}
videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, _composition.duration); // And that's where everything breaks!
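
For comparison, a corrected sketch of the loop: CMTime is a plain struct, so CMTimeAdd() cannot modify its arguments in place, and the result has to be assigned back to currentTime.

CMTime currentTime = kCMTimeZero;
for (int i = (int)assets.count - 1; i >= 0; i--)
{
    AVAsset *asset = assets[i];
    // ... insertTimeRange:ofTrack:atTime:error: as before ...
    currentTime = CMTimeAdd(currentTime, asset.duration); // assign the result back!
}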

The lesson I learned: when working with AVFoundation, always check your time values! It's very important; otherwise you'll run into lots of bugs.
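
One cheap way to do that check while debugging (a sketch of mine, not from the original post) is to log time values with CoreMedia's CMTimeShow()/CMTimeRangeShow() and assert validity before building the instructions:

#import <CoreMedia/CoreMedia.h>

// Sketch: sanity-check the composition's time values before exporting.
NSAssert(CMTIME_IS_VALID(currentTime), @"currentTime is invalid");
CMTimeShow(currentTime);                                // prints e.g. {240/600 = 0.400}
CMTimeRangeShow(videoCompositionInstruction.timeRange); // prints the instruction's range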

Regarding ios - Error Domain=AVFoundationErrorDomain Code=-11821 "Cannot Decode", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/26209099/
