iOS AVFoundation export session missing audio

Tags: ios audio video export avfoundation

I'm using the iOS AVFoundation framework and I can successfully merge video tracks, image overlays, and text overlays. However, my output file does not keep the audio from the original source video intact.

How can I make sure that the audio from one of my source videos is carried over into the new video I create?

Edit

Use this code as a good example of how to go about creating a video that keeps the original audio. It wasn't obvious to me that the audio track has to be added to the composition separately when working with video in AVFoundation. Hopefully this helps someone else.

    // 'url' is the source AVAsset (e.g. an AVURLAsset) and 'videoComposition' is the
    // AVMutableComposition being assembled; both are assumed to exist at this point.
    NSError *error = nil;
    AVAssetTrack *videoTrack = nil;
    AVAssetTrack *audioTrack = nil;
    CMTime insertionPoint = kCMTimeZero;

    if([[url tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
        videoTrack = [url tracksWithMediaType:AVMediaTypeVideo][0];
    }

    if([[url tracksWithMediaType:AVMediaTypeAudio] count] != 0) {
        audioTrack = [url tracksWithMediaType:AVMediaTypeAudio][0];
    }

    // Insert the video and audio tracks from AVAsset
    if (videoTrack != nil) {
        AVMutableCompositionTrack *compositionVideoTrack = [videoComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [url duration]) ofTrack:videoTrack atTime:insertionPoint error:&error];
    }
    if (audioTrack != nil) {
        AVMutableCompositionTrack *compositionAudioTrack = [videoComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [url duration]) ofTrack:audioTrack atTime:insertionPoint error:&error];
    }
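
For completeness, the composition built above still has to be written out before you get a playable file. A minimal export sketch, assuming videoComposition is the AVMutableComposition assembled above and outputURL is a writable file URL you supply:

    AVAssetExportSession *exportSession =
        [[AVAssetExportSession alloc] initWithAsset:videoComposition
                                         presetName:AVAssetExportPresetHighestQuality];
    exportSession.outputURL = outputURL;                      // assumed: writable file URL
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;  // .mov container

    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            NSLog(@"Export finished; audio track was carried over");
        } else {
            NSLog(@"Export failed: %@", exportSession.error);
        }
    }];

The accepted answer below shows the same export step in the context of stitching two clips together.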

Best Answer

Here is the complete code that solves this problem; it joins two videos together along with their audio:

    AVURLAsset *video1 = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:path1] options:nil];
    AVURLAsset *video2 = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:path2] options:nil];

    if (video1 != nil && video2 != nil) {

        // 1 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
        AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];

        // 2 - Video track
        AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                            preferredTrackID:kCMPersistentTrackID_Invalid];
        AVMutableCompositionTrack *firstTrackAudio = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                 preferredTrackID:kCMPersistentTrackID_Invalid];

        [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, video1.duration)
                            ofTrack:[[video1 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
        [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, video2.duration)
                            ofTrack:[[video2 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:video1.duration error:nil];

        // video1 has an audio track
        if ([[video1 tracksWithMediaType:AVMediaTypeAudio] count] > 0)
        {
            AVAssetTrack *clipAudioTrack = [[video1 tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
            [firstTrackAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, video1.duration) ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];
        }

        // video2 has an audio track
        if ([[video2 tracksWithMediaType:AVMediaTypeAudio] count] > 0)
        {
            AVAssetTrack *clipAudioTrack = [[video2 tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
            [firstTrackAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, video2.duration) ofTrack:clipAudioTrack atTime:video1.duration error:nil];
        }

        // Export session
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                          presetName:AVAssetExportPresetHighestQuality];

        // Creates the path to export to - saving to the temporary directory
        NSString *filename = [NSString stringWithFormat:@"Video_%d.mov", arc4random() % 1000];
        NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:filename];

        // Checks if there is already a file at the output URL.
        if ([[NSFileManager defaultManager] fileExistsAtPath:path])
        {
            NSLog(@"Removing item at path: %@", path);
            [[NSFileManager defaultManager] removeItemAtPath:path error:nil];
        }

        exporter.outputURL = [NSURL fileURLWithPath:path];
        // Set the output file type
        exporter.outputFileType = AVFileTypeQuickTimeMovie;

        // Project-specific bookkeeping from the original answer
        path3 = path;
        [arr_StoredDocumentoryUrls addObject:path3];

        // Exports!
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            switch (exporter.status) {
                case AVAssetExportSessionStatusCompleted:
                    NSLog(@"Export Complete");
                    break;
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"Export Error: %@", [exporter.error description]);
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Export Cancelled");
                    break;
                default:
                    break;
            }
        }];
    }
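
One caveat about the answer above: every insertTimeRange: call passes error:nil, so a failed insert goes unnoticed until the export produces an empty or silent file. A small variant that surfaces the error, reusing the firstTrack and video1 names from the answer:

    NSError *insertError = nil;
    BOOL inserted = [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, video1.duration)
                                        ofTrack:[[video1 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                         atTime:kCMTimeZero
                                          error:&insertError];
    if (!inserted) {
        // insertTimeRange:ofTrack:atTime:error: returns NO on failure and fills in the error
        NSLog(@"Could not insert video track: %@", insertError);
    }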

This question about "iOS AVFoundation export session missing audio" is based on a similar question on Stack Overflow: https://stackoverflow.com/questions/14026584/
