I am trying to record audio data from the microphone into a .wav file and play it back. I also need the actual data (amplitudes) to plot a graph, so I am using AudioUnit. I set an inputCallBack and a renderCallBack on the AudioUnit object, but I do not know how to write the AudioBuffers to the .wav file from the render_CallBack method. I have attached the code I have tried so far.
Please help...
Step 1
AudioStreamBasicDescription audioStreamBasicDesc;
AudioUnit.AudioUnit audioUnit;
string m_recordingFilePath;
ExtAudioFile extAudioFileObj;

public override void ViewDidLoad()
{
    base.ViewDidLoad();

    // 16 kHz, mono, 16-bit signed-integer linear PCM
    audioStreamBasicDesc.SampleRate = 16000;
    audioStreamBasicDesc.Format = AudioFormatType.LinearPCM;
    audioStreamBasicDesc.FramesPerPacket = 1;
    audioStreamBasicDesc.ChannelsPerFrame = 1;
    audioStreamBasicDesc.BytesPerFrame = audioStreamBasicDesc.ChannelsPerFrame * sizeof(short);
    audioStreamBasicDesc.BytesPerPacket = audioStreamBasicDesc.ChannelsPerFrame * sizeof(short);
    audioStreamBasicDesc.BitsPerChannel = 16;
    audioStreamBasicDesc.Reserved = 0;
    audioStreamBasicDesc.FormatFlags = AudioFormatFlags.IsSignedInteger | AudioFormatFlags.IsPacked;

    // Build a time-stamped file path under the app's tmp directory
    var documents = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
    var tmp = Path.Combine(documents, "..", "tmp");
    m_recordingFilePath = Path.Combine(tmp,
        String.Format("{0}.wav",
            "MyFile" + DateTime.Now.ToString("MM-dd-yyyy HH-mm-ss",
                CultureInfo.InvariantCulture)));

    extAudioFileObj = ExtAudioFile.CreateWithUrl(CFUrl.FromFile(m_recordingFilePath),
                                                 AudioFileType.WAVE,
                                                 audioStreamBasicDesc,
                                                 AudioFileFlags.EraseFlags);
    prepareAudioUnit();
}
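One detail worth noting: ExtAudioFile's ClientDataFormat is assigned inside the render callback on every call (step 4). As a sketch (same fields and names as above), it can instead be set once, right after the file is created, so the render callback stays free of setup work:

// Sketch: configure the client format once, immediately after creating the file.
// ExtAudioFile then converts from this client format to the file's data format
// on every subsequent Write call.
extAudioFileObj = ExtAudioFile.CreateWithUrl(CFUrl.FromFile(m_recordingFilePath),
                                             AudioFileType.WAVE,
                                             audioStreamBasicDesc,
                                             AudioFileFlags.EraseFlags);
extAudioFileObj.ClientDataFormat = audioStreamBasicDesc;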
Step 2
public void prepareAudioUnit()
{
    var _audioComponent = AudioComponent.FindComponent(AudioTypeOutput.Remote);
    audioUnit = _audioComponent.CreateAudioUnit();

    // Enable IO on the input scope of bus 1 (the microphone bus)
    audioUnit.SetEnableIO(true,
        AudioUnitScopeType.Input,
        1 // Remote input
    );

    // Set the audio format on the output scope of the input bus
    audioUnit.SetAudioFormat(audioStreamBasicDesc,
        AudioUnitScopeType.Output,
        1
    );

    audioUnit.SetInputCallback(input_CallBack, AudioUnitScopeType.Input, 1);
    audioUnit.SetRenderCallback(render_CallBack, AudioUnitScopeType.Global, 0);
    audioUnit.Initialize();
    audioUnit.Start();
}
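A matching teardown is also needed when recording stops: disposing the ExtAudioFile is what closes it and finalizes the WAV header, and skipping that step is a common cause of a small, unplayable .wav file. A minimal sketch (the StopRecording method name is hypothetical; the fields mirror step 1):

public void StopRecording()
{
    audioUnit.Stop();
    audioUnit.Dispose();

    // Disposing the ExtAudioFile closes it, which writes the final
    // RIFF/data chunk sizes into the header; an unclosed file often
    // plays back as corrupted.
    extAudioFileObj.Dispose();
    extAudioFileObj = null;
}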
Step 3
AudioUnitStatus input_CallBack(AudioUnitRenderActionFlags actionFlags,
                               AudioTimeStamp timeStamp,
                               uint busNumber,
                               uint numberFrames,
                               AudioUnit.AudioUnit audioUnit)
{
    return AudioUnitStatus.NoError;
}
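On the Remote IO unit the microphone data arrives on bus 1, so the input callback is a natural place to pull it with Render into a buffer you own, independently of playback. A sketch of that approach (assumes a using directive for System.Runtime.InteropServices; buffer sizing assumes the 16-bit mono format from step 1):

AudioUnitStatus input_CallBack(AudioUnitRenderActionFlags actionFlags,
                               AudioTimeStamp timeStamp,
                               uint busNumber,
                               uint numberFrames,
                               AudioUnit.AudioUnit audioUnit)
{
    // Allocate an AudioBuffers list big enough for numberFrames of 16-bit mono
    using (var buffers = new AudioBuffers(1))
    {
        int byteSize = (int)numberFrames * sizeof(short);
        buffers.SetData(0, Marshal.AllocHGlobal(byteSize), byteSize);
        try
        {
            // Pull the captured samples from the input bus (bus 1)
            var status = audioUnit.Render(ref actionFlags, timeStamp, 1, numberFrames, buffers);
            if (status == AudioUnitStatus.OK)
            {
                // buffers[0].Data now points at the raw PCM samples
            }
            return status;
        }
        finally
        {
            Marshal.FreeHGlobal(buffers[0].Data);
        }
    }
}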
Step 4
AudioUnitStatus render_CallBack(AudioUnitRenderActionFlags actionFlags,
                                AudioTimeStamp timeStamp,
                                uint busNumber,
                                uint numberFrames,
                                AudioBuffers data)
{
    // Pull the microphone input signal from the input bus
    var status = audioUnit.Render(ref actionFlags,
                                  timeStamp,
                                  1, // Remote input
                                  numberFrames,
                                  data);
    if (status != AudioUnitStatus.OK)
    {
        return status;
    }

    // Get a pointer to the sample buffer; the format is 16-bit signed
    // integer, so the samples must be read as short, not int
    var outP = data[0].Data;
    unsafe
    {
        var outPtr = (short*)outP.ToPointer();
        for (int i = 0; i < numberFrames; i++)
        {
            var val = *outPtr;
            outPtr++;
            //lastestPickVal = val; // this is for plotting the graph
            Console.WriteLine(val);
        }
    }

    extAudioFileObj.ClientDataFormat = audioStreamBasicDesc;
    // Here I am trying to write the data into the .wav file. The file is
    // generated, but it is corrupted and contains no actual data
    // (the created file is roughly 4 KB to 100 KB in size).
    var err = extAudioFileObj.Write(numberFrames, data);
    Console.WriteLine("OUTPUT" + busNumber);
    return AudioUnitStatus.NoError;
}
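The Write call above is the synchronous variant, which performs blocking file I/O; Apple advises against that on the real-time render thread, and ExtAudioFile exposes WriteAsync for exactly this case. A sketch of the tail of the callback with that swap (assuming ExtAudioFileError.OK as the success value in this binding):

// WriteAsync queues the buffer and performs the actual file I/O on an
// internal thread, which is what a render callback needs; the synchronous
// Write can block and starve the audio thread.
var err = extAudioFileObj.WriteAsync(numberFrames, data);
if (err != ExtAudioFileError.OK)
{
    Console.WriteLine("WriteAsync failed: " + err);
}
return AudioUnitStatus.NoError;

Remember also to dispose the ExtAudioFile when recording stops; the WAV header is only finalized when the file is closed.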
Best answer
A while back I wrote an IAudioStream abstraction for Xamarin that might be of some help to you. It gets byte buffers from an AudioQueueBuffer; what you are probably looking for is encoding the buffers to bytes:
There is also a WAV recorder class that hooks into the source and writes it out as a WAV file, but that happens after the raw signal has been extracted:
I hope this at least gives you something to work with.
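The recorder class referenced above is not reproduced here, but the core of writing PCM bytes as a WAV file is just a 44-byte RIFF header in front of the raw samples. A minimal sketch (the WriteWav helper is hypothetical; defaults assume the 16 kHz mono 16-bit format from the question):

using System;
using System.IO;
using System.Text;

static void WriteWav(string path, byte[] pcmData, int sampleRate = 16000,
                     short channels = 1, short bitsPerSample = 16)
{
    using (var fs = File.Create(path))
    using (var bw = new BinaryWriter(fs))
    {
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        short blockAlign = (short)(channels * bitsPerSample / 8);

        bw.Write(Encoding.ASCII.GetBytes("RIFF"));
        bw.Write(36 + pcmData.Length);               // total chunk size
        bw.Write(Encoding.ASCII.GetBytes("WAVE"));
        bw.Write(Encoding.ASCII.GetBytes("fmt "));
        bw.Write(16);                                // fmt subchunk size
        bw.Write((short)1);                          // format 1 = PCM
        bw.Write(channels);
        bw.Write(sampleRate);
        bw.Write(byteRate);
        bw.Write(blockAlign);
        bw.Write(bitsPerSample);
        bw.Write(Encoding.ASCII.GetBytes("data"));
        bw.Write(pcmData.Length);
        bw.Write(pcmData);                           // raw little-endian samples
    }
}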
Regarding "ios - Record audio from the microphone as .wav and play it back using AudioUnit in Xamarin.ios", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/34869534/