ios - Audio missing after adding a filter effect to a captured video in Swift 3 iOS

Tags: ios swift avfoundation video-processing cifilter

I am working on a video-based application in which I need to apply a CIFilter to a captured video chosen from the device library. For this I am using the VideoEffects library below:

https://github.com/FlexMonkey/VideoEffects

With it I can add a filter to my video, but the problem is that the audio is missing from the final output. I tried to add the audio asset with the following code, but it does not work:

    videoOutputURL = documentDirectory.appendingPathComponent("Output_\(timeDateFormatter.string(from: Date())).mp4")

    do {
      videoWriter = try AVAssetWriter(outputURL: videoOutputURL!, fileType: AVFileTypeMPEG4)
    }
    catch {
      fatalError("** unable to create asset writer **")
    }

    let outputSettings: [String : AnyObject] = [
      AVVideoCodecKey: AVVideoCodecH264 as AnyObject,
      AVVideoWidthKey: currentItem.presentationSize.width as AnyObject,
      AVVideoHeightKey: currentItem.presentationSize.height as AnyObject]

    guard videoWriter!.canApply(outputSettings: outputSettings, forMediaType: AVMediaTypeVideo) else {
      fatalError("** unable to apply video settings ** ")
    }


    videoWriterInput = AVAssetWriterInput(
      mediaType: AVMediaTypeVideo,
      outputSettings: outputSettings)


    //setup audio writer
    let audioOutputSettings: Dictionary<String, AnyObject> = [
        AVFormatIDKey : Int(kAudioFormatMPEG4AAC) as AnyObject,
        AVSampleRateKey:48000.0 as AnyObject,
        AVNumberOfChannelsKey:NSNumber(value: 1),
        AVEncoderBitRateKey : 128000 as AnyObject
    ]

    guard videoWriter!.canApply(outputSettings: audioOutputSettings, forMediaType: AVMediaTypeAudio) else {
        fatalError("** unable to apply Audio settings ** ")
    }

    audioWriterInput = AVAssetWriterInput(
        mediaType: AVMediaTypeAudio,
        outputSettings: audioOutputSettings)


    if videoWriter!.canAdd(videoWriterInput!) {
      videoWriter!.add(videoWriterInput!)
      videoWriter!.add(audioWriterInput!)
    }
    else {
      fatalError ("** unable to add input **")
    }
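From what I understand, the audio writer input also has to be fed sample buffers read from the source asset, roughly like the sketch below (only an illustration of the general pattern, assuming `asset` is the source AVAsset and the writer session has already been started), but I have not managed to make this work together with the VideoEffects pipeline:

// Rough sketch, not the VideoEffects pipeline: an AVAssetWriterInput writes no
// track unless sample buffers are actually appended to it.
let audioReader = try! AVAssetReader(asset: asset)
let audioTrack = asset.tracks(withMediaType: AVMediaTypeAudio).first!
let readerSettings: [String: Any] = [AVFormatIDKey: Int(kAudioFormatLinearPCM)] // decode to PCM so the writer can re-encode to AAC
let readerOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: readerSettings)
audioReader.add(readerOutput)
audioReader.startReading()

let audioQueue = DispatchQueue(label: "audio.writer.queue")
audioWriterInput!.requestMediaDataWhenReady(on: audioQueue) {
    while audioWriterInput!.isReadyForMoreMediaData {
        if let sampleBuffer = readerOutput.copyNextSampleBuffer() {
            if !audioWriterInput!.append(sampleBuffer) { break }
        } else {
            audioWriterInput!.markAsFinished()
            break
        }
    }
}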

Is there any other way to apply a filter to a video? Please give me suggestions.

I also tried adding a CIFilter with GPUImage, but that only works for live video, not for an already captured video.

Best Answer

Starting with iOS 9.0, you can use AVVideoComposition to apply Core Image filters to a video frame by frame:

import AVFoundation
import CoreImage

// `asset` is the AVAsset of the captured video, created earlier
let filter = CIFilter(name: "CIGaussianBlur")!
let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
    // Clamp to avoid blurring transparent pixels at the image edges
    let source = request.sourceImage.clampingToExtent()
    filter.setValue(source, forKey: kCIInputImageKey)

    // Vary filter parameters based on video timing
    let seconds = CMTimeGetSeconds(request.compositionTime)
    filter.setValue(seconds * 10.0, forKey: kCIInputRadiusKey)

    // Crop the blurred output to the bounds of the original frame
    let output = filter.outputImage!.cropping(to: request.sourceImage.extent)

    // Hand the filtered frame back to AVFoundation
    request.finish(with: output, context: nil)
})

Now we can create an AVPlayerItem from that asset, attach the composition, and play it with an AVPlayer:

let playerItem = AVPlayerItem(asset: asset)
playerItem.videoComposition = composition
let player = AVPlayer(playerItem: playerItem)
player.play()

The Core Image filter is applied frame by frame in real time. Because the video composition only processes the video frames, the asset's audio track is left untouched, so the audio is preserved during playback. You can also export the filtered video with the AVAssetExportSession class.
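For example, a minimal export sketch (Swift 3), assuming `asset` and `composition` from above and a hypothetical `exportURL` for the output file, could look like this:

let export = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality)!
export.outputURL = exportURL
export.outputFileType = AVFileTypeMPEG4
export.videoComposition = composition // the filter is applied to every frame during export

export.exportAsynchronously {
    switch export.status {
    case .completed:
        print("Exported filtered video, including the original audio track, to \(exportURL)")
    default:
        print("Export failed: \(String(describing: export.error))")
    }
}

Unlike a hand-rolled AVAssetWriter pipeline, the export session copies the asset's audio track automatically, so no extra audio handling is needed.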

There is a great introduction to this in a WWDC 2015 session: Link

A similar question can be found on Stack Overflow: https://stackoverflow.com/questions/49649194/
