ios - Mute Video + Transform Video in a single export operation with AVAssetExportSession

Tags: ios objective-c avfoundation avcomposition

I have the following code to fix the transform (orientation) of a video:

    - (AVVideoComposition *)squareVideoCompositionFor:(AVAsset *)asset {

        AVAssetTrack *track = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;

        // The square render size is the longer side of the video
        CGFloat length = MAX(track.naturalSize.width, track.naturalSize.height);
        CGSize size = track.naturalSize;
        CGFloat scale = 0;

        // Inspect preferredTransform to work out how the video was recorded
        CGAffineTransform transform = track.preferredTransform;

        if (transform.a == 0 && transform.b == 1 && transform.c == -1 && transform.d == 0) {
            scale = -1;
        }
        else if (transform.a == 0 && transform.b == -1 && transform.c == 1 && transform.d == 0) {
            scale = -1;
        }
        else if (transform.a == 1 && transform.b == 0 && transform.c == 0 && transform.d == 1) {
            scale = 1;
        }
        else if (transform.a == -1 && transform.b == 0 && transform.c == 0 && transform.d == -1) {
            scale = -1;
        }

        // Translate so the video is centered inside the square render area
        transform = CGAffineTransformTranslate(transform, scale * -(size.width - length) / 2, scale * -(size.height - length) / 2);

        AVMutableVideoCompositionLayerInstruction *transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:track];
        [transformer setTransform:transform atTime:kCMTimeZero];

        AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, kCMTimePositiveInfinity);
        instruction.layerInstructions = @[transformer];

        AVMutableVideoComposition *composition = [AVMutableVideoComposition videoComposition];
        composition.frameDuration = CMTimeMake(1, 30);
        composition.renderSize = CGSizeMake(length, length);
        composition.instructions = @[instruction];
        composition.renderScale = 1.0;

        return composition;
    }
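To make the translation concrete: assuming a hypothetical 1920×1080 landscape clip whose preferredTransform is the identity, length is 1920 and scale is 1, so the translation works out to (-(1920 - 1920) / 2, -(1080 - 1920) / 2) = (0, 420), and the video ends up vertically centered inside the 1920×1920 render square.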

And the following code to mute the audio:

- (AVMutableComposition *)removeAudioFromVideoFileFor:(AVAsset *)asset {
    AVMutableComposition *composition_Mix = [AVMutableComposition composition];
    AVMutableCompositionTrack *compositionVideoTrack = [composition_Mix addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

    BOOL ok = NO;

    AVAssetTrack *sourceVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    CMTimeRange x = CMTimeRangeMake(kCMTimeZero, [asset duration]);
    NSError *error;

    // Only the video track is copied into the new composition,
    // so the resulting asset has no audio track (i.e. it is muted).
    ok = [compositionVideoTrack insertTimeRange:x ofTrack:sourceVideoTrack atTime:kCMTimeZero error:&error];

    return composition_Mix;
}
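As a quick sanity check (not part of the question, just one way to confirm the mute step), the returned composition can be inspected for audio tracks; since only the video track was inserted, the count should be zero:

AVMutableComposition *muted = [self removeAudioFromVideoFileFor:asset];
// The composition should contain no audio tracks at all
NSUInteger audioTrackCount = [muted tracksWithMediaType:AVMediaTypeAudio].count;
NSLog(@"audio tracks in composition: %lu", (unsigned long)audioTrackCount); // expected: 0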

How the functions are called:

    AVAsset *asset = [AVAsset assetWithURL:inputURL];

    AVMutableComposition *composition = [self  removeAudioFromVideoFileFor:asset];

    AVAssetExportSession *session = [AVAssetExportSession exportSessionWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
    session.videoComposition = [self squareVideoCompositionFor:asset];
    session.outputURL = outputURL;
    session.outputFileType = AVFileTypeMPEG4;
    session.shouldOptimizeForNetworkUse = true;
    session.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);

But if I use the composition and [self squareVideoCompositionFor:asset] together, it shows the error:

Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo={NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}

If I omit either one, it works fine, meaning a single AVAssetExportSession can either mute the audio of the video or make it square, but not both at once.

Is there a way to achieve both with a single export pass of AVAssetExportSession?

Best Answer

Your code looks fine, but I made a few changes to get it working.

inputURL and outputURL should be prefixed with file:// or https:// (since they are URLs; in your case they should start with file://).

If yours are not valid URLs, you will not get the desired output.

//FOR OUTPUT URL
NSString *path = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0];
path = [path stringByAddingPercentEncodingWithAllowedCharacters:[NSCharacterSet URLQueryAllowedCharacterSet]];

//the output video will be written to file final.mp4
NSURL *outputURL = [NSURL fileURLWithPath:path];
outputURL = [outputURL URLByAppendingPathComponent:@"final.mp4"];
NSLog(@"outputURL = %@", outputURL);


//FOR INPUT URL
//This is the path of the bundle resource that is going to be used
NSURL *inputURL = [[NSBundle mainBundle] URLForResource:@"video" withExtension:@"mp4"];
NSLog(@"inputURL = %@", inputURL);

Exporting the composition:

//this will export the composition with the specified configuration
[session exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"Success");
}];
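The completion handler above only logs "Success", even though the export may have failed. A minimal sketch (not from the original answer) that checks the session's status and error before declaring success:

[session exportAsynchronouslyWithCompletionHandler:^{
    switch (session.status) {
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"Export finished to %@", outputURL);
            break;
        case AVAssetExportSessionStatusFailed:
            // e.g. the -11841 "The video could not be composed" error lands here
            NSLog(@"Export failed: %@", session.error);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export cancelled");
            break;
        default:
            break;
    }
}];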

When you see the "Success" log in the console, check your app's Documents directory. The video will have been written to outputURL.

Note: Use CMD + SHIFT + G in Finder and paste the output URL. You will be taken to your app's Documents folder (simulator only). For a device, you need to download the app container and inspect the package contents.

Complete code

Your removeAudioFromVideoFileFor: and squareVideoCompositionFor: methods look fine. Just change the following.

Here "video" is the name of the resource file in the app bundle.

- (void)viewDidLoad {
    [super viewDidLoad];

    NSString *path = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0];
    path = [path stringByAddingPercentEncodingWithAllowedCharacters:[NSCharacterSet URLQueryAllowedCharacterSet]];
    NSURL *outputURL = [NSURL fileURLWithPath:path];
    outputURL = [outputURL URLByAppendingPathComponent:@"final.mp4"];
    NSLog(@"outputURL = %@", outputURL);

    NSURL *inputURL = [[NSBundle mainBundle] URLForResource:@"video" withExtension:@"mp4"];
    NSLog(@"inputURL = %@", inputURL);

    AVAsset *asset = [AVAsset assetWithURL:inputURL];

    AVMutableComposition *composition = [self removeAudioFromVideoFileFor:asset];

    AVAssetExportSession *session = [AVAssetExportSession exportSessionWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
    session.videoComposition = [self squareVideoCompositionFor:asset];
    session.outputURL = outputURL;
    session.outputFileType = AVFileTypeMPEG4;
    session.shouldOptimizeForNetworkUse = true;
    session.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);

    [session exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"Success:");
    }];
}

Hope this helps.

Regarding ios - Mute Video + Transform Video in a single export operation with AVAssetExportSession, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/47013157/
