ios - iOS Video Merging in the Background

Tags: ios objective-c iphone video video-processing

Task: merge a flyer image into a flyer video.

Use case:

  • Create a flyer [add emoji images/text, etc.]
  • Make a video

Case 1

  • Press the back button [the user goes back to the app's list of flyers screen]; during this we merge the flyerSnapShoot into the flyerVideo. It works perfectly.
  • Go to the Phone Gallery, where we can see the updated video.

Case 2

  • Press the iPhone home button; I do exactly the same thing as above, but I get the following error:

FAIL = Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x17266d40 {NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x172b3920 "The operation couldn’t be completed. (OSStatus error -16980.)", NSLocalizedFailureReason=An unknown error occurred (-16980)}

Code:

- (void)modifyVideo:(NSURL *)src destination:(NSURL *)dest crop:(CGRect)crop
              scale:(CGFloat)scale overlay:(UIImage *)image
         completion:(void (^)(NSInteger, NSError *))callback {

    // Get a pointer to the asset
    AVURLAsset* firstAsset = [AVURLAsset URLAssetWithURL:src options:nil];

    // Make an instance of AVMutableComposition so that we can edit this asset:
    AVMutableComposition* mixComposition = [AVMutableComposition composition];

    // Add tracks to this composition
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

    // Audio track
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    // The image video is always 30 seconds, so we use that unless the source video is shorter.
    CMTime inTime = CMTimeMake( MAX_VIDEO_LENGTH * VIDEOFRAME, VIDEOFRAME );
    if ( CMTimeCompare( firstAsset.duration, inTime ) < 0 ) {
        inTime = firstAsset.duration;
    }

    // Add to the video track.
    NSArray *videos = [firstAsset tracksWithMediaType:AVMediaTypeVideo];
    CGAffineTransform transform = CGAffineTransformIdentity; // default, in case the asset has no video track
    if ( videos.count > 0 ) {
        AVAssetTrack *track = [videos objectAtIndex:0];
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, inTime) ofTrack:track atTime:kCMTimeZero error:nil];
        transform = track.preferredTransform;
        videoTrack.preferredTransform = transform;
    }

    // Add the audio track.
    NSArray *audios = [firstAsset tracksWithMediaType:AVMediaTypeAudio];
    if ( audios.count > 0 ) {
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, inTime) ofTrack:[audios objectAtIndex:0] atTime:kCMTimeZero error:nil];
    }

    NSLog(@"Natural size: %.2f x %.2f", videoTrack.naturalSize.width, videoTrack.naturalSize.height);

    // Set the mix composition size.
    mixComposition.naturalSize = crop.size;

    // Set up the composition parameters.
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.frameDuration = CMTimeMake(1, VIDEOFRAME );
    videoComposition.renderSize = crop.size;
    videoComposition.renderScale = 1.0;

    // Pass through parameters for animation.
    AVMutableVideoCompositionInstruction *passThroughInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    passThroughInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, inTime);

    // Layer instructions
    AVMutableVideoCompositionLayerInstruction *passThroughLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];

    // Set the transform to maintain orientation
    if ( scale != 1.0 ) {
        CGAffineTransform scaleTransform = CGAffineTransformMakeScale( scale, scale);
        CGAffineTransform translateTransform = CGAffineTransformTranslate( CGAffineTransformIdentity,
                                                                          -crop.origin.x,
                                                                          -crop.origin.y);
        transform = CGAffineTransformConcat( transform, scaleTransform );
        transform = CGAffineTransformConcat( transform, translateTransform);
    }

    [passThroughLayer setTransform:transform atTime:kCMTimeZero];

    passThroughInstruction.layerInstructions = @[ passThroughLayer ];
    videoComposition.instructions = @[passThroughInstruction];

    // If an image is given, then put that in the animation.
    if ( image != nil ) {

        // Layer that merges the video and image
        CALayer *parentLayer = [CALayer layer];
        parentLayer.frame = CGRectMake( 0, 0, crop.size.width, crop.size.height);

        // Layer that renders the video.
        CALayer *videoLayer = [CALayer layer];
        videoLayer.frame = CGRectMake(0, 0, crop.size.width, crop.size.height );
        [parentLayer addSublayer:videoLayer];

        // Layer that renders the flyer image.
        CALayer *imageLayer = [CALayer layer];
        imageLayer.frame = CGRectMake(0, 0, crop.size.width, crop.size.height );
        imageLayer.contents = (id)image.CGImage;
        [imageLayer setMasksToBounds:YES];

        [parentLayer addSublayer:imageLayer];

        // Setup the animation tool
        videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
    }

    // Now export the movie
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    exportSession.videoComposition = videoComposition;

    // Export the URL
    exportSession.outputURL = dest;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    exportSession.shouldOptimizeForNetworkUse = YES;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        callback( exportSession.status, exportSession.error );
    }];
}
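
For reference, here is a minimal sketch of how this method might be invoked; the file paths, crop rectangle, and overlay image below are hypothetical placeholders, not values from the original question:

// Hypothetical invocation of modifyVideo:... -- paths, crop and overlay are placeholders.
NSURL *src  = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"flyerVideo.mov"]];
NSURL *dest = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"flyerVideoMerged.mov"]];
UIImage *flyerSnapShoot = [UIImage imageNamed:@"flyerSnapShoot"];

[self modifyVideo:src
      destination:dest
             crop:CGRectMake(0, 0, 640, 640)
            scale:1.0
          overlay:flyerSnapShoot
       completion:^(NSInteger status, NSError *error) {
    if (status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Export finished: %@", dest);
    } else {
        NSLog(@"FAIL = %@", error);
    }
}];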

I call this function from AppDelegate.m:

- (void)applicationDidEnterBackground:(UIApplication *)application
{
    bgTask = [application beginBackgroundTaskWithName:@"MyTask" expirationHandler:^{
        // Clean up any unfinished task business by marking where you
        // stopped or ending the task outright.
        [application endBackgroundTask:bgTask];
        bgTask = UIBackgroundTaskInvalid;
    }];

    // Start the long-running task and return immediately.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{

        // Do the work associated with the task, preferably in chunks.
         [self goingToBg];

        [application endBackgroundTask:bgTask];
        bgTask = UIBackgroundTaskInvalid;
    });

    NSLog(@"backgroundTimeRemaining: %f", [[UIApplication sharedApplication] backgroundTimeRemaining]);
}
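
One thing worth noting about this setup: exportAsynchronouslyWithCompletionHandler: returns immediately, so the dispatch_async block above may call endBackgroundTask: long before the export has actually finished. A sketch of keeping the task alive until the export's completion callback fires instead (assuming modifyVideo:... is reachable from the app delegate; the srcURL, destURL, cropRect, and flyerImage properties are hypothetical):

- (void)applicationDidEnterBackground:(UIApplication *)application
{
    bgTask = [application beginBackgroundTaskWithName:@"MyTask" expirationHandler:^{
        [application endBackgroundTask:bgTask];
        bgTask = UIBackgroundTaskInvalid;
    }];

    // End the background task only once the asynchronous export reports back.
    [self modifyVideo:self.srcURL destination:self.destURL crop:self.cropRect
                scale:1.0 overlay:self.flyerImage
           completion:^(NSInteger status, NSError *error) {
        [application endBackgroundTask:bgTask];
        bgTask = UIBackgroundTaskInvalid;
    }];
}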

Best Answer

I did a lot of R&D on this problem and did not find a solution.

I want to share a few links in the hope that they will help the Stack community if anyone runs into the same problem [requirement].

Link 1: AVExportSession to run in background

Quote relevant to the question [copied from Link 1 above]:

Sadly, since AVAssetExportSession uses the gpu to do some of it's work, it cannot run in the background if you are using an AVVideoComposition.
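
Given that limitation, one workaround is to never start a composition-backed export while the app is in the background, and instead defer it until the app becomes active again. A rough sketch of that idea (the exportPending flag and beginCompositionExport method are hypothetical names, not part of any API):

// Hypothetical deferral: postpone GPU-backed exports until the app is in the foreground.
- (void)startExportIfPossible
{
    if ([UIApplication sharedApplication].applicationState == UIApplicationStateBackground) {
        self.exportPending = YES;   // hypothetical flag; retry when active again
        return;
    }
    [self beginCompositionExport];  // hypothetical method that runs modifyVideo:... above
}

- (void)applicationDidBecomeActive:(UIApplication *)application
{
    if (self.exportPending) {
        self.exportPending = NO;
        [self beginCompositionExport];
    }
}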

Link 2: Starting AVAssetExportSession in the Background

Quote relevant to the question [copied from Link 2 above]:

You can start AVAssetExportSession in background. The only limitations in AVFoundation to performing work in the background, are using AVVideoCompositions or AVMutableVideoCompositions. AVVideoCompositions are using the GPU, and the GPU cannot be used in the background
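
In other words, an export session that never sets videoComposition (for example, a straight passthrough re-mux) should still be allowed to run in the background. A minimal sketch, assuming the image-overlay step is skipped; srcURL and destURL are placeholders:

// Background-safe export: no AVVideoComposition attached, so no GPU work is required.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:srcURL options:nil];
AVAssetExportSession *session =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPresetPassthrough];
session.outputURL = destURL;
session.outputFileType = AVFileTypeQuickTimeMovie;
[session exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"Status: %ld, error: %@", (long)session.status, session.error);
}];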

URLs on background tasks:

APPLE DEV URL

RAYWENDERLICH URL

Stack question

Regarding "ios - iOS Video Merging in the Background", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/28694975/
