swift - Media type of sample buffer must match receiver's media type ("soun")

Tags: swift, avfoundation

Based on this answer (https://stackoverflow.com/a/16035330/1615183) I wrote the following Swift code to compress a video:

var videoWriter:AVAssetWriter!
var videoWriterInput:AVAssetWriterInput!
var processingQueue:dispatch_queue_t  = dispatch_queue_create("processingQueue1", nil)
var processingQueue2:dispatch_queue_t = dispatch_queue_create("processingQueue2", nil)
var audioWriterInput:AVAssetWriterInput!

func encode(){

    NSFileManager.defaultManager().removeItemAtURL(self.outputFile, error: nil)

    let videoCleanApertureSettings = [AVVideoCleanApertureHeightKey: 720,
        AVVideoCleanApertureWidthKey: 1280,
        AVVideoCleanApertureHorizontalOffsetKey: 2,
        AVVideoCleanApertureVerticalOffsetKey: 2
    ]
    let codecSettings  = [AVVideoAverageBitRateKey: 1024000,
        AVVideoCleanApertureKey: videoCleanApertureSettings
    ]

    let videoSettings = [AVVideoCodecKey: AVVideoCodecH264,
        AVVideoCompressionPropertiesKey: codecSettings,
        AVVideoHeightKey: 720, AVVideoWidthKey: 1280]


    //setup video writer
    var error:NSError?
    let asset = AVURLAsset(URL: self.inputFile, options: nil)

    let videoTrack:AVAssetTrack = asset.tracksWithMediaType(AVMediaTypeVideo)[0] as AVAssetTrack
    let videoSize:CGSize = videoTrack.naturalSize

    videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)
    videoWriterInput.expectsMediaDataInRealTime = false
    videoWriterInput.transform = videoTrack.preferredTransform
    videoWriter = AVAssetWriter(URL: self.outputFile, fileType: AVFileTypeQuickTimeMovie, error: &error)

    if videoWriter.canAddInput(videoWriterInput) {
        videoWriter.addInput(videoWriterInput)
    }else{
        println("cant add video writer input")
        return
    }

    //setup video reader

    let videoReaderSettings = [ kCVPixelBufferPixelFormatTypeKey as String : kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]

    let videoReaderOutput:AVAssetReaderTrackOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings) // should it be videoReaderSettings?

    let videoReader:AVAssetReader = AVAssetReader(asset: asset, error: &error)
    if videoReader.canAddOutput(videoReaderOutput) {
        videoReader.addOutput(videoReaderOutput)
    }

    //setup audio writer
    audioWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: nil)

    audioWriterInput.expectsMediaDataInRealTime = false
    if videoWriter.canAddInput(audioWriterInput){
        videoWriter.addInput(audioWriterInput)
    }

    //setup audio reader
    let audioTrack:AVAssetTrack = asset.tracksWithMediaType(AVMediaTypeVideo)[0] as AVAssetTrack
    let audioReaderOutput:AVAssetReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
    let audioReader:AVAssetReader = AVAssetReader(asset: asset, error: &error)

    if audioReader.canAddOutput(audioReaderOutput) {
        audioReader.addOutput(audioReaderOutput)
    }else {
        println("cant add audio reader")
        return
    }


    videoWriter.startWriting()
    videoReader.startReading()

    videoWriter.startSessionAtSourceTime(kCMTimeZero)




    videoWriterInput.requestMediaDataWhenReadyOnQueue(processingQueue) {
        while self.videoWriterInput.readyForMoreMediaData {
            println("First loop")
            var sampleBuffer = videoReaderOutput.copyNextSampleBuffer()
            if videoReader.status == .Reading && sampleBuffer != nil {
                println("Appending")
                self.videoWriterInput.appendSampleBuffer(sampleBuffer)
            }else {
                self.videoWriterInput.markAsFinished()
                if videoReader.status == .Completed {

                    audioReader.startReading()
                    self.videoWriter.startSessionAtSourceTime(kCMTimeZero)

                    self.audioWriterInput.requestMediaDataWhenReadyOnQueue(self.processingQueue2) {
                        while self.audioWriterInput.readyForMoreMediaData {
                            println("Second loop")
                            var sampleBuffer2:CMSampleBufferRef? = audioReaderOutput.copyNextSampleBuffer()
                            if audioReader.status == .Reading && sampleBuffer2 != nil {
                                self.audioWriterInput.appendSampleBuffer(sampleBuffer2)
                            }else {
                                self.audioWriterInput.markAsFinished()
                                println("Audio finish")
                                self.videoWriter.finishWritingWithCompletionHandler { println("Done") }
                            }
                        }

                    }


                }
                else {
                    println("Video Reader not completed")
                }
                println("Finished")
                break
        }// else videoSampleBuffer
        }
    }

 }

However, if I remove the audio part, I only get an empty file. If I run it as is, the first pass of the second loop works fine, but on the second iteration it crashes with the following error:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetWriterInput appendSampleBuffer:] Media type of sample buffer must match receiver's media type ("soun")'

Has anyone run into the same problem?

Best answer

In the audio-reader setup, change AVMediaTypeVideo to AVMediaTypeAudio. Because the code fetches the video track, the sample buffers it reads carry the "vide" media type, while audioWriterInput was created with AVMediaTypeAudio and therefore only accepts "soun" buffers, which is exactly what the exception reports:

let audioTrack:AVAssetTrack = asset.tracksWithMediaType(AVMediaTypeVideo)[0] as AVAssetTrack

should be

let audioTrack:AVAssetTrack = asset.tracksWithMediaType(AVMediaTypeAudio)[0] as AVAssetTrack
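As a defensive variant (a sketch using the same Swift 1.x era AVFoundation API, not part of the original answer), you can also check that the asset actually contains an audio track before indexing into the array, which would otherwise raise an out-of-bounds crash for silent videos:

```swift
// Hypothetical hardening of the fix: fetch the audio tracks first and
// only build the reader output when one actually exists, instead of
// force-indexing [0] on a possibly empty array.
let audioTracks = asset.tracksWithMediaType(AVMediaTypeAudio)
if audioTracks.count > 0 {
    let audioTrack = audioTracks[0] as AVAssetTrack
    let audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
    // ...add audioReaderOutput to audioReader as in the question
} else {
    println("Asset has no audio track; skipping the audio pass")
}
```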

For this question, "swift - Media type of sample buffer must match receiver's media type ("soun")", see the original on Stack Overflow: https://stackoverflow.com/questions/29283536/
