I'm basically trying to record audio chunks coming from a WebRTC stream, and I've already been able to send the binary data with the help of this resource: HTML Audio Capture streaming to Node.js.
I'm using netty-socketio because that library plays well with socket.io on the client side.
These are my server endpoints:
server.addEventListener("audio-blob", byte[].class, (socketIOClient, bytes, ackRequest) -> {
    byteArrayList.add(bytes);
});

server.addEventListener("audio-blob-end", Object.class, (socket, string, ackRequest) -> {
    ByteArrayInputStream in = new ByteArrayInputStream(byteArrayList.getArray());
    AudioInputStream audioIn = new AudioInputStream(in, getAudioFormat(), 48000L);
    AudioFileFormat.Type fileType = AudioFileFormat.Type.WAVE;
    File wavFile = new File("RecordAudio.wav");
    AudioSystem.write(audioIn, fileType, wavFile);
});
The format settings:
public static AudioFormat getAudioFormat() {
    float sampleRate = 48000;
    int sampleSizeInBits = 8;
    int channels = 2;
    boolean signed = true;
    boolean bigEndian = true;
    AudioFormat format = new AudioFormat(sampleRate, sampleSizeInBits,
            channels, signed, bigEndian);
    return format;
}
I'm using this class to collect the byte arrays (and yes, I'm aware of the risks of this approach):
class ByteArrayList {
    private List<Byte> bytesList;

    public ByteArrayList() {
        bytesList = new ArrayList<Byte>();
    }

    public void add(byte[] bytes) {
        add(bytes, 0, bytes.length);
    }

    public void add(byte[] bytes, int offset, int length) {
        for (int i = offset; i < (offset + length); i++) {
            bytesList.add(bytes[i]);
        }
    }

    public int size() {
        return bytesList.size();
    }

    public byte[] getArray() {
        byte[] bytes = new byte[bytesList.size()];
        for (int i = 0; i < bytesList.size(); i++) {
            bytes[i] = bytesList.get(i);
        }
        return bytes;
    }
}
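As a side note, the standard library already provides a growable byte buffer: java.io.ByteArrayOutputStream avoids boxing every byte into a List<Byte>. A minimal sketch of a drop-in replacement (my own, not from the original post):

```java
import java.io.ByteArrayOutputStream;

// A simpler alternative to the hand-rolled ByteArrayList above.
class AudioByteBuffer {
    private final ByteArrayOutputStream buffer = new ByteArrayOutputStream();

    // Append a received chunk; the underlying array grows as needed.
    public synchronized void add(byte[] bytes) {
        buffer.write(bytes, 0, bytes.length);
    }

    public synchronized int size() {
        return buffer.size();
    }

    // Returns a copy of everything received so far.
    public synchronized byte[] getArray() {
        return buffer.toByteArray();
    }
}
```

The synchronized methods are a hedge in case the listener callbacks are invoked from different threads.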
However, the resulting wav file only plays noise; nothing of the recording comes through. What am I doing wrong?
Accepted Answer
While googling around for an answer, I stumbled upon this resource: how to save a wav file.
What I did wrong was passing a fixed size in the AudioInputStream constructor arguments:

    new AudioInputStream(in, getAudioFormat(), 48000L)

Changing it to:

    new AudioInputStream(in, getAudioFormat(), byteArrayList.getArray().length);
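One caveat (my own observation, not part of the original answer): the third argument of this AudioInputStream constructor is documented as a length in sample frames, not bytes. With the format in the question a frame is 2 bytes (8-bit samples × 2 channels), so a stricter sketch would divide the byte count by the frame size:

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import java.io.ByteArrayInputStream;

public class FrameLengthDemo {
    public static AudioFormat getAudioFormat() {
        // Same parameters as in the question.
        return new AudioFormat(48000, 8, 2, true, true);
    }

    public static AudioInputStream toStream(byte[] data) {
        AudioFormat format = getAudioFormat();
        // The constructor's third argument is the length in sample frames,
        // so divide the byte count by the frame size
        // (here 2 bytes per frame: 8-bit samples x 2 channels).
        long frames = data.length / format.getFrameSize();
        return new AudioInputStream(new ByteArrayInputStream(data), format, frames);
    }

    public static void main(String[] args) {
        byte[] data = new byte[96000]; // one second of audio at this format
        AudioInputStream in = toStream(data);
        System.out.println(in.getFrameLength()); // 48000 frames
    }
}
```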
This answer is from a similar question on Stack Overflow, "node.js - Handling a WebRTC byte[] stream?": https://stackoverflow.com/questions/37576122/