android - Stream Android Audio to a Java Server

Tags: android, audio

I am developing an Android app in which the user speaks and the app sends the audio to a Java server running on a computer. I used the code below, together with my own IPv4 address, taken from here:
Stream Live Android Audio to Server

When I run the app it tells me that it has stopped working, and there are other errors as well. Can someone help me fix this? Thank you.

1) I create an Android project and put the code in MainActivity.
2) I create a Java project and put the server code in a class.
3) I run the server.
4) I plug in my Android device and run the app on it.
5) When I press Start (start recording), it does not work.
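
One thing worth ruling out before digging into the streaming code itself (the question does not confirm this; it is just a common cause of an immediate "stopped working" crash): if the RECORD_AUDIO permission is missing from AndroidManifest.xml, or the requested buffer size is invalid, the AudioRecord never reaches its initialized state and startRecording() throws an IllegalStateException. A minimal guard, sketched against the fields used in the code below:

// Hypothetical guard, using the fields (recorder, sampleRate, etc.) from the activity below.
// RECORD_AUDIO and INTERNET must also be declared in AndroidManifest.xml for this code to work.
recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate, channelConfig, audioFormat, minBufSize * 10);
if (recorder.getState() != AudioRecord.STATE_INITIALIZED) {
    Log.e("VS", "AudioRecord failed to initialize - check RECORD_AUDIO permission and buffer size");
    return;
}
recorder.startRecording();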

My Android app:

package com.example.mictest;
import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.UnknownHostException;

import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;

public class MainActivity extends Activity {
private Button startButton,stopButton;

public byte[] buffer;
public static DatagramSocket socket;
private int port=50005;
AudioRecord recorder;

private int sampleRate = 44100;
private int channelConfig = AudioFormat.CHANNEL_CONFIGURATION_MONO;    
private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;       
int minBufSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
private boolean status = true;




@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    startButton = (Button) findViewById (R.id.start_button);
    stopButton = (Button) findViewById (R.id.stop_button);

    startButton.setOnClickListener (startListener);
    stopButton.setOnClickListener (stopListener);

    minBufSize += 2048;
    System.out.println("minBufSize: " + minBufSize);
}

private final OnClickListener stopListener = new OnClickListener() {

    @Override
    public void onClick(View arg0) {
                status = false;
                recorder.release();
                Log.d("VS","Recorder released");
    }

};

private final OnClickListener startListener = new OnClickListener() {

    @Override
    public void onClick(View arg0) {
                status = true;
                startStreaming();           
    }

};

public void startStreaming() {


    Thread streamThread = new Thread(new Runnable() {

        @Override
            public void run() {
                try {

                    DatagramSocket socket = new DatagramSocket();
                    Log.d("VS", "Socket Created");

                    byte[] buffer = new byte[minBufSize];

                    Log.d("VS","Buffer created of size " + minBufSize);
                    DatagramPacket packet;

                    final InetAddress destination = InetAddress.getByName("172.20.129.255");
                    Log.d("VS", "Address retrieved");


                    recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate, channelConfig, audioFormat, minBufSize * 10);
                    Log.d("VS", "Recorder initialized");

                    recorder.startRecording();

                    while (status == true) {

                        // reading data from MIC into buffer
                        minBufSize = recorder.read(buffer, 0, buffer.length);

                        // putting buffer in the packet
                        packet = new DatagramPacket(buffer, buffer.length, destination, port);

                        socket.send(packet);
                        Log.d("VS", "MinBufferSize: " + minBufSize);
                    }


            } catch(UnknownHostException e) {
                Log.e("VS", "UnknownHostException");
            } catch (IOException e) {
                e.printStackTrace();
                Log.e("VS", "IOException");
            } 
        }

    });
    streamThread.start();
 }
 }

My Java server:

package com.datagram;

import java.io.ByteArrayInputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.FloatControl;
import javax.sound.sampled.SourceDataLine;

class Server {

AudioInputStream audioInputStream;
static AudioInputStream ais;
static AudioFormat format;
static boolean status = true;
static int port = 50005;
static int sampleRate = 44100;

public static void main(String args[]) throws Exception {


    DatagramSocket serverSocket = new DatagramSocket(50005);

    /**
     * Formula for lag = (byte_size/sample_rate)*2
     * Byte size 9728 will produce ~ 0.45 seconds of lag. Voice slightly broken.
     * Byte size 1400 will produce ~ 0.06 seconds of lag. Voice extremely broken.
     * Byte size 4000 will produce ~ 0.18 seconds of lag. Voice slightly more broken than 9728.
     */

    byte[] receiveData = new byte[4000];

    format = new AudioFormat(sampleRate, 16, 1, true, false);

    while (status == true) {
        DatagramPacket receivePacket = new DatagramPacket(receiveData,
                receiveData.length);

        serverSocket.receive(receivePacket);

        ByteArrayInputStream baiss = new ByteArrayInputStream(
                receivePacket.getData());

        ais = new AudioInputStream(baiss, format, receivePacket.getLength());
        toSpeaker(receivePacket.getData());

    }
}

public static void toSpeaker(byte soundbytes[]) {
    try {

        DataLine.Info dataLineInfo = new DataLine.Info(SourceDataLine.class, format);
        SourceDataLine sourceDataLine = (SourceDataLine) AudioSystem.getLine(dataLineInfo);

        sourceDataLine.open(format);

        FloatControl volumeControl = (FloatControl) sourceDataLine.getControl(FloatControl.Type.MASTER_GAIN);
        volumeControl.setValue(100.0f);

        sourceDataLine.start();
        sourceDataLine.open(format);

        sourceDataLine.start();

        System.out.println("format? :" + sourceDataLine.getFormat());

        sourceDataLine.write(soundbytes, 0, soundbytes.length);
        System.out.println(soundbytes.toString());
        sourceDataLine.drain();
        sourceDataLine.close();
    } catch (Exception e) {
        System.out.println("Not working in speakers...");
        e.printStackTrace();
    }
}
}

Best Answer

I think the problem is with the code inside the while loop:

while (status == true) {
    // YOU WANT TO RECORD IN YOUR BUFFERSIZE?
    // reading data from MIC into buffer
    minBufSize = recorder.read(buffer, 0, buffer.length);

    // putting buffer in the packet
    packet = new DatagramPacket(buffer, buffer.length, destination, port);

    socket.send(packet);
    Log.d("VS", "MinBufferSize: " + minBufSize);
}
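
For reference, here is a minimal sketch of how that loop could be written so that minBufSize keeps its original meaning and only the bytes actually returned by read() are sent. The variable names follow the question's code; this is an illustration of the point above rather than a tested fix:

int bytesRead;
while (status) {
    // read() returns the number of bytes actually captured, which can be
    // less than buffer.length - don't overwrite minBufSize with it
    bytesRead = recorder.read(buffer, 0, buffer.length);

    if (bytesRead > 0) {
        // send only the bytes captured in this iteration
        packet = new DatagramPacket(buffer, bytesRead, destination, port);
        socket.send(packet);
        Log.d("VS", "Bytes sent: " + bytesRead);
    }
}

A similar detail applies on the server side: toSpeaker() is handed receivePacket.getData(), which is always the full 4000-byte receive buffer, so writing only receivePacket.getLength() bytes to the SourceDataLine would avoid playing stale data from previous packets (again an observation about the posted code, not part of the accepted answer).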

Regarding android - Stream Android Audio to a Java Server, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/23002084/
