ios - How to play audio from an AVAudioPCMBuffer converted from NSData

Tags: ios objective-c xcode swift

I receive 16-bit mono PCM audio data from UDP packets like this:

- (void)udpSocket:(GCDAsyncUdpSocket *)sock didReceiveData:(NSData *)data
                                                fromAddress:(NSData *)address
                                          withFilterContext:(id)filterContext
{
...
}

I am converting this data to a PCM buffer by calling a Swift function like this:

func toPCMBuffer(data: NSData) -> AVAudioPCMBuffer {
    let audioFormat = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false)  // given NSData audio format
    var PCMBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity:1024*10)
    PCMBuffer.frameLength = PCMBuffer.frameCapacity

    let channels = UnsafeBufferPointer(start: PCMBuffer.floatChannelData, count: Int(PCMBuffer.format.channelCount))

    data.getBytes(UnsafeMutablePointer<Void>(channels[0]) , length: data.length)

    return PCMBuffer
}
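Note that the incoming data is described as 16-bit integer PCM, while the buffer format above is Float32; copying the raw bytes across without conversion is a likely reason for silence or noise. A minimal pure-Swift sketch of the missing conversion (`int16ToFloat32` is a hypothetical helper name, assuming little-endian Int16 samples):

```swift
import Foundation

// Hypothetical helper: convert raw little-endian Int16 PCM bytes (as
// received over UDP) into normalized Float32 samples in [-1, 1), which
// is the layout AVAudioPCMBuffer's floatChannelData expects.
func int16ToFloat32(_ data: Data) -> [Float] {
    let sampleCount = data.count / 2
    var samples = [Float](repeating: 0, count: sampleCount)
    for i in 0..<sampleCount {
        // Assemble each little-endian 16-bit sample byte by byte,
        // so the alignment of the underlying storage does not matter.
        let lo = UInt16(data[data.startIndex + 2 * i])
        let hi = UInt16(data[data.startIndex + 2 * i + 1])
        let sample = Int16(bitPattern: hi << 8 | lo)
        samples[i] = Float(sample) / 32768.0  // normalize to [-1, 1)
    }
    return samples
}
```

The resulting floats could then be copied into `floatChannelData[0]` instead of the raw bytes.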

The data is converted to a PCM buffer and I can see its length in the log, but when I try to play the buffer I hear no sound. Here is the receiving code:

func toPCMBuffer(data: NSData) -> AVAudioPCMBuffer {
        let audioFormat = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false)  // given NSData audio format
        var PCMBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity:1024*10)
        PCMBuffer.frameLength = PCMBuffer.frameCapacity

        let channels = UnsafeBufferPointer(start: PCMBuffer.floatChannelData, count: Int(PCMBuffer.format.channelCount))

        data.getBytes(UnsafeMutablePointer<Void>(channels[0]) , length: data.length)
        var mainMixer = audioEngine.mainMixerNode
        audioEngine.attachNode(audioFilePlayer)
        audioEngine.connect(audioFilePlayer, to:mainMixer, format: PCMBuffer.format)
        audioEngine.startAndReturnError(nil)

        audioFilePlayer.play()
        audioFilePlayer.scheduleBuffer(PCMBuffer, atTime: nil, options: nil, completionHandler: nil)
        return PCMBuffer
    }
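A second issue in the snippet above is that `frameLength` is set to the full `frameCapacity` (10240 frames) regardless of how much data was actually copied, so most of the scheduled buffer is uninitialized. A small sketch of deriving the frame count from the payload size instead (`frameCount(forPayloadBytes:)` is an illustrative helper, assuming 16-bit mono input, i.e. 2 bytes per frame):

```swift
import Foundation

// Illustrative helper: compute the value to assign to frameLength from
// the number of payload bytes, instead of hard-coding frameCapacity.
// For 16-bit mono PCM, one frame == one Int16 sample == 2 bytes.
func frameCount(forPayloadBytes byteCount: Int,
                bytesPerFrame: Int = MemoryLayout<Int16>.size) -> Int {
    return byteCount / bytesPerFrame
}
```

For example, a 320-byte UDP payload at 8 kHz would correspond to 160 frames (20 ms of audio).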

Best Answer

I ended up using an Objective-C function instead; the data conversion works correctly:

- (AudioBufferList *)getBufferListFromData:(NSData *)data
{
    if (data.length > 0)
    {
        NSUInteger len = [data length];
        //I guess you can use Byte*, void* or Float32*. I am not sure if that makes any difference.
        Byte *byteData = (Byte *)malloc(len);
        memcpy(byteData, [data bytes], len);
        if (byteData)
        {
            AudioBufferList *theDataBuffer = (AudioBufferList *)malloc(sizeof(AudioBufferList));
            theDataBuffer->mNumberBuffers = 1;
            theDataBuffer->mBuffers[0].mDataByteSize = (UInt32)len;
            theDataBuffer->mBuffers[0].mNumberChannels = 1;
            theDataBuffer->mBuffers[0].mData = byteData;
            // The caller owns both allocations: free theDataBuffer->mBuffers[0].mData
            // and then theDataBuffer itself when done.
            return theDataBuffer;
        }
    }
    return NULL;
}
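Since `getBufferListFromData:` mallocs both the `AudioBufferList` struct and the sample bytes, the caller must free both or the audio path leaks one allocation per packet. The same hand-over-ownership pattern can be sketched in plain Swift (no CoreAudio types; `copyBytes(from:)` is a hypothetical helper, assuming non-empty input like the `data.length > 0` check above):

```swift
import Foundation

// Illustrative only: mirrors the malloc-and-hand-over pattern of the
// Objective-C answer. The returned pointer owns a copy of the bytes;
// the caller is responsible for calling deallocate() on it.
// Precondition: data is non-empty (mirrors the data.length > 0 guard).
func copyBytes(from data: Data) -> UnsafeMutableRawPointer {
    let buffer = UnsafeMutableRawPointer.allocate(
        byteCount: data.count,
        alignment: MemoryLayout<UInt8>.alignment)
    data.withUnsafeBytes { raw in
        buffer.copyMemory(from: raw.baseAddress!, byteCount: data.count)
    }
    return buffer
}
```

Forgetting the matching `deallocate()` (or `free()` in the Objective-C version) in the playback completion handler is an easy mistake with per-packet buffers.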

A similar question on playing audio from an AVAudioPCMBuffer converted from NSData can be found on Stack Overflow: https://stackoverflow.com/questions/31423790/
