I want to record audio in real time on iOS, analyze the raw audio data, and save part of the recorded data. I am recording the data using this code: https://gist.github.com/hotpaw2/ba815fc23b5d642705f2b1dedfaf0107
Now my data is stored in a Float array, and I want to save it to an audio file. I tried to do it with this code:
let fileMgr = FileManager.default
let dirPaths = fileMgr.urls(for: .documentDirectory, in: .userDomainMask)
let recordSettings = [AVEncoderAudioQualityKey: AVAudioQuality.min.rawValue,
                      AVEncoderBitRateKey: 16,
                      AVNumberOfChannelsKey: 2,
                      AVSampleRateKey: 44100] as [String: Any]
let soundFileUrl = dirPaths[0].appendingPathComponent("recording-" + getDate() + ".pcm")
do {
    let audioFile = try AVAudioFile(forWriting: soundFileUrl, settings: recordSettings)
    let format = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: 44100, channels: 2, interleaved: true)
    let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 3000)
    for i in 0..<circBuffer.count {
        audioFileBuffer.int16ChannelData?.pointee[i] = Int16(circBuffer[i])
    }
    try audioFile.write(from: audioFileBuffer)
}
On the last line, I get this error:
ERROR: >avae> AVAudioFile.mm:306: -[AVAudioFile writeFromBuffer:error:]: error -50
amplitudeDemo(79264,0x70000f7cb000) malloc: *** error for object 0x7fc5b9057e00: incorrect checksum for freed object - object was probably modified after being freed.
*** set a breakpoint in malloc_error_break to debug
I have searched through many other questions, but could not find anything that helped.
Best Answer
In your code, this line:
let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 3000)
declares an AVAudioPCMBuffer with a capacity of 3000 frames × 2 channels, which means the buffer allocated for audioFileBuffer can hold 6000 samples. When your channel-data index goes past that limit, your code corrupts nearby regions of the heap, which causes the "object was probably modified" error.
So your circBuffer.count is most likely exceeding that limit. You need to allocate a large enough buffer for the AVAudioPCMBuffer.
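The frames-versus-samples arithmetic behind that limit can be sketched in plain Swift (no AVFoundation needed; the numbers mirror the snippet above, and the 8820-sample figure is just an illustrative example):

```swift
// For an interleaved PCM buffer, frameCapacity counts frames, not samples:
// one frame holds one sample per channel.
let channels = 2
let frameCapacity = 3000
let maxSamples = frameCapacity * channels   // what the buffer can actually hold
print(maxSamples)  // 6000

// Writing circBuffer.count samples therefore needs this many frames:
let sampleCount = 8820                      // e.g. 0.1 s of stereo 44.1 kHz audio
let framesNeeded = sampleCount / channels
print(framesNeeded)  // 4410
```

Indexing past `maxSamples` is exactly the heap corruption the malloc message complains about.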
do {
    //### You need to specify commonFormat: and interleaved: when creating the AVAudioFile
    let audioFile = try AVAudioFile(forWriting: soundFileUrl, settings: recordSettings,
                                    commonFormat: .pcmFormatInt16, interleaved: true)
    let channels = 2
    //In current Swift, AVAudioFormat and AVAudioPCMBuffer initializers are failable;
    // force-unwrapping is safe here because the parameters are known-valid.
    let format = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: 44100,
                               channels: AVAudioChannelCount(channels), interleaved: true)!
    let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: format,
                                           frameCapacity: AVAudioFrameCount(circBuffer.count / channels))! //<-allocate enough frames
    //### `stride` removed as it seems useless...
    let int16ChannelData = audioFileBuffer.int16ChannelData! //<-cannot be nil for the `format` above
    //When interleaved, channel data of AVAudioPCMBuffer is not as described in its doc:
    // https://developer.apple.com/reference/avfoundation/avaudiopcmbuffer/1386212-floatchanneldata .
    //The following code is modified to work with the actual AVAudioPCMBuffer behavior.
    //Assuming `circBuffer` is:
    // - interleaved, 2-channel, Float32
    // - with each sample normalized to [-1.0, 1.0] (the usual floating-point audio format)
    for i in 0..<circBuffer.count {
        int16ChannelData[0][i] = Int16(circBuffer[i] * Float(Int16.max))
    }
    //You need to update the `frameLength` of the `AVAudioPCMBuffer`.
    audioFileBuffer.frameLength = AVAudioFrameCount(circBuffer.count / channels)
    try audioFile.write(from: audioFileBuffer)
} catch {
    print("Error", error)
}
A few notes have been added as comments; please check them before trying this code.
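One detail worth guarding against: `Int16(circBuffer[i] * Float(Int16.max))` traps at runtime if a sample falls even slightly outside [-1.0, 1.0]. A clamping conversion can be sketched in plain Swift (the helper name is illustrative, not part of the original answer):

```swift
// Convert normalized Float samples to Int16 PCM, clamping out-of-range
// values instead of letting Int16.init trap on overflow.
func floatToInt16(_ samples: [Float]) -> [Int16] {
    return samples.map { s in
        let scaled = s * Float(Int16.max)
        let clamped = min(max(scaled, Float(Int16.min)), Float(Int16.max))
        return Int16(clamped)   // truncates toward zero, now guaranteed in range
    }
}

let pcm = floatToInt16([0.0, 0.5, -1.0, 1.5])
print(pcm)  // [0, 16383, -32767, 32767]
```

The same `map` result could be copied into `int16ChannelData[0]` in place of the loop above.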
UPDATE Sorry for having shown untested code. Two things are fixed:
- You need to specify commonFormat: (and interleaved:) when instantiating the AVAudioFile.
- int16ChannelData (and the other channel-data properties) does not return the pointer layout described in its documentation when the format is interleaved; the data-filling loop is modified to match the actual behavior.
Please try it.
Regarding this iOS Swift malloc error, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/41487882/