swift - Getting the value of audio data from an UnsafeMutablePointer<Int16> in Swift

Tags: swift audio avassetreader unsafe-pointers cmsamplebuffer

I'm trying to convert this code to Swift; it lets me get at the audio data for visualization. The code I used in Obj-C works well:

    while (reader.status == AVAssetReaderStatusReading) {
        AVAssetReaderTrackOutput *trackOutput = (AVAssetReaderTrackOutput *)[reader.outputs objectAtIndex:0];
        self.sampleBufferRef = [trackOutput copyNextSampleBuffer];
        if (self.sampleBufferRef) {

            CMBlockBufferRef blockBufferRef = CMSampleBufferGetDataBuffer(self.sampleBufferRef);
            size_t bufferLength = CMBlockBufferGetDataLength(blockBufferRef);
            void *data = malloc(bufferLength);
            CMBlockBufferCopyDataBytes(blockBufferRef, 0, bufferLength, data);

            SInt16 *samples = (SInt16 *)data;
            int sampleCount = bufferLength / bytesPerInputSample;

            for (int i = 0; i < sampleCount; i += 100) {
                Float32 sample = (Float32) *samples++;

                sample = decibel(sample);
                sample = minMaxX(sample, noiseFloor, 0);
                tally += sample;

                for (int j = 1; j < channelCount; j++)
                    samples++;
                tallyCount++;

                if (tallyCount == downsampleFactor) {
                    sample = tally / tallyCount;
                    maximum = maximum > sample ? maximum : sample;
                    [fullSongData appendBytes:&sample length:sizeof(sample)]; // tried dividing the sample by 2
                    tally = 0;
                    tallyCount = 0;
                    outSamples++;
                }
            }

            CMSampleBufferInvalidate(self.sampleBufferRef);
            CFRelease(self.sampleBufferRef);
            free(data);
        }
    }

In Swift, this is the part I'm trying to write:

    while (reader.status == AVAssetReaderStatus.Reading) {
        var trackOutput = reader.outputs[0] as! AVAssetReaderTrackOutput
        self.sampleBufferRef = trackOutput.copyNextSampleBuffer()

        if (self.sampleBufferRef != nil) {

            let blockBufferRef = CMSampleBufferGetDataBuffer(self.sampleBufferRef)
            let bufferLength = CMBlockBufferGetDataLength(blockBufferRef)
            var data = NSMutableData(length: bufferLength)
            CMBlockBufferCopyDataBytes(blockBufferRef, 0, bufferLength, data!.mutableBytes)

            var samples = UnsafeMutablePointer<Int16>(data!.mutableBytes)

            var sampleCount = Int32(bufferLength) / bytesPerInputSample

            for var i = 0; i < Int(sampleCount); i++ {

                var sampleValue = CGFloat(samples[i]) etc. etc.

However, when I print sampleValue to the console with println(), I get (Opaque Value). I don't know how to actually read sampleValue.

I'm just getting started with reading audio data for visualization. Any help with getting at the audio data buffer would be appreciated. Thanks.
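In current Swift (3 and later), the numeric sample values can be read by binding the copied bytes to Int16 and going through a buffer pointer. A minimal sketch, assuming the data buffer from the attempt above holds 16-bit PCM (variable names are illustrative):

    // Minimal sketch: view the bytes copied by CMBlockBufferCopyDataBytes as Int16 samples.
    // Assumes `data` is the NSMutableData from the Swift attempt above and holds 16-bit PCM.
    let sampleCount = data!.length / MemoryLayout<Int16>.size
    let int16Pointer = data!.mutableBytes.bindMemory(to: Int16.self, capacity: sampleCount)
    let buffer = UnsafeBufferPointer(start: int16Pointer, count: sampleCount)

    for i in stride(from: 0, to: sampleCount, by: 100) {
        let sampleValue = Float(buffer[i])  // an ordinary number that can be printed or plotted
        print(sampleValue)
    }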

Best Answer

Use stride?

    let bytesPerInputSample = 4 // assumption ;)

    var samplePtr = data.mutableBytes

    for _ in stride(from: 0, to: data.length, by: bytesPerInputSample) {
        let currentSample = Data(bytes: samplePtr, count: bytesPerInputSample)
        // do whatever is needed with current sample

        //...

        // increase ptr by size of sample
        samplePtr = samplePtr + bytesPerInputSample
    }
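As a usage example, here is a fuller sketch in current Swift (5.x) that puts the pieces together: copy each CMBlockBuffer into an Int16 array, then downsample to decibel values for drawing. It assumes reader is an AVAssetReader already configured for 16-bit linear PCM and started with startReading(); readWaveformPoints, channelCount, and noiseFloor are illustrative names, not anything from the question's project:

    import AVFoundation

    /// Sketch only (Swift 5): reads 16-bit PCM samples from an already-started
    /// AVAssetReader (configured as in the question) and returns rough dB points.
    /// The function name and parameters are illustrative.
    func readWaveformPoints(from reader: AVAssetReader,
                            channelCount: Int = 2,
                            noiseFloor: Float = -50.0) -> [Float] {
        var samples16 = [Int16]()

        while reader.status == .reading {
            guard let output = reader.outputs.first,
                  let sampleBuffer = output.copyNextSampleBuffer(),
                  let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { break }

            // Copy the block buffer's bytes straight into an Int16 array.
            let length = CMBlockBufferGetDataLength(blockBuffer)
            var chunk = [Int16](repeating: 0, count: length / MemoryLayout<Int16>.size)
            chunk.withUnsafeMutableBytes { raw in
                guard let base = raw.baseAddress else { return }
                _ = CMBlockBufferCopyDataBytes(blockBuffer, atOffset: 0,
                                               dataLength: raw.count, destination: base)
            }
            samples16.append(contentsOf: chunk)
        }

        // Rough downsample of channel 0: every 100th frame, converted to dB and
        // clamped to the noise floor (mirrors the question's decibel/minMaxX idea).
        let step = 100 * channelCount
        return stride(from: 0, to: samples16.count, by: step).map { i in
            let amplitude = abs(Float(samples16[i])) / Float(Int16.max)
            return max(20 * log10(amplitude), noiseFloor)
        }
    }

The question's own per-sample processing (decibel, minMaxX, tally/downsampleFactor) would slot into the final map in place of the simple dB conversion shown here.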

Regarding "swift - Getting the value of audio data from an UnsafeMutablePointer<Int16> in Swift", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/31642480/
