I need help transferring a PNG image over TCP from my Raspberry Pi (Python) to my Android app (Java). I've spent nearly two weeks trying to understand and solve this problem, so any help would be greatly appreciated.
I have set up a client-server architecture in which my Raspberry Pi 3 records audio, performs some analysis on it, and then sends the data (over TCP) to the Android app to be displayed on the app's screen. Once the recording and analysis are complete, I can connect and transfer string data to be displayed on the app without any problems. However, I have not been able to transfer an image from the RPi to the Android app. So basically, the image is stored on the RPi, and I am trying to transfer it to the app to display it.
Current implementation:
On the RPi (Python): As I said, sending strings and displaying them on the Android app works without any problems. When I send the image portion of the audio analysis, I first send a string that says "?start" so the Android side knows an image is about to be sent rather than a string (and waits to update the GUI until it has received the entire image). Then, I open the image stored on the RPi and read the whole image in as a byte array (typically around 40-50k bytes). I take the length of the byte array and send it to the Android app as a string. Finally, I send the byte array to Android and wait for an OK message back from the app. All of this works without reporting any errors.
On the Android app (Java): When the app receives the "?start" string, it uses a BufferedReader (which is what I used to read the string data that transferred to the app successfully before) to read the size of the image byte array. Then I create a buffer, msg_buff, that reads at most 1024 bytes at a time, and a ByteArrayOutputStream, baos, that will hold the entire byte array of the image. In an infinite while loop, I have a DataInputStream named in that reads bytes into msg_buff and returns the number of bytes read. I then append the contents of msg_buff to baos. The while loop breaks once the number of bytes read from in is -1, or img_offset (which is just the running total of bytes read) is greater than or equal to the size of the image byte array. Then I try to save the image to Android internal storage so it can be loaded into an ImageView for display later. This code successfully reads bytes until there are about 2000-3000 bytes left to read, and then it seems to freeze on the int bytes_read = in.read(msg_buff, 0, byte_size) line. I haven't been able to get past this point, so I don't know whether saving the image to internal storage and then loading it that way will even work. I believe it freezes because some bytes are lost or never sent from Python to Java. Does anyone know how I can fix this?
The code that reads the image data from the Python server is in the run() method.
TcpClient.java
import android.content.Context;
import android.content.ContextWrapper;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.util.Log;
import java.io.*;
import java.net.InetAddress;
import java.net.Socket;
public class TcpClient {
public static final String SERVER_IP = myIPAddress; //your computer IP address
public static final int SERVER_PORT = myPortNumber;
// message to send to the server
private String mServerMessage;
// sends message received notifications
private OnMessageReceived mMessageListener = null;
// while this is true, the server will continue running
private boolean mRun = false;
// used to send messages
private PrintWriter mBufferOut;
// used to read messages from the server
private BufferedReader mBufferIn;
/**
* Constructor of the class. OnMessagedReceived listens for the messages received from server
*/
public TcpClient(OnMessageReceived listener) {
mMessageListener = listener;
}
/**
* Sends the message entered by client to the server
*
* @param message text entered by client
*/
public void sendMessage(String message) {
if (mBufferOut != null && !mBufferOut.checkError()) {
mBufferOut.println(message);
mBufferOut.flush();
}
}
/**
* Close the connection and release the members
*/
public void stopClient() {
Log.i("Debug", "stopClient");
mRun = false;
if (mBufferOut != null) {
mBufferOut.flush();
mBufferOut.close();
}
mMessageListener = null;
mBufferIn = null;
mBufferOut = null;
mServerMessage = null;
}
public void run() {
mRun = true;
try {
//here you must put your computer's IP address.
InetAddress serverAddr = InetAddress.getByName(SERVER_IP);
Log.e("TCP Client", "C: Connecting...");
//create a socket to make the connection with the server
Socket socket = new Socket(serverAddr, SERVER_PORT);
try {
InputStream sin = socket.getInputStream();
OutputStream sout = socket.getOutputStream();
DataInputStream in = new DataInputStream(sin);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
mBufferIn = new BufferedReader(new InputStreamReader(socket.getInputStream()));
//in this while the client listens for the messages sent by the server
while (mRun) {
mServerMessage = mBufferIn.readLine();
if (mServerMessage != null && mMessageListener != null) {
//Check if data is image
if(mServerMessage.equals("?start"))
{
mServerMessage = mBufferIn.readLine();
String fileName = "";
if(mServerMessage.equals("signal"))
{
fileName = "signal.jpeg";
}
else if(mServerMessage.equals("spec"))
{
fileName = "spec.jpeg";
}
// Get length of image byte array
int size = Integer.parseInt(mBufferIn.readLine());
Log.i("Debug:", "image message size: "+size);
// Create buffers
byte[] msg_buff = new byte[1024];
//byte[] img_buff = new byte[size];
int img_offset = 0;
while(true){
int byte_size = msg_buff.length;
int bytes_read = in.read(msg_buff, 0, byte_size);
Log.i("Debug:", "image message bytes:" + bytes_read);
if(bytes_read == -1){
break;
}
//copy bytes into img_buff
//System.arraycopy(msg_buff, 0, img_buff, img_offset, bytes_read);
baos.write(msg_buff, 0, bytes_read);
img_offset += bytes_read;
Log.i("Debug:", "image message bytes read:"+img_offset);
if( img_offset >= size)
{
break;
}
}
try{
byte[] data = baos.toByteArray();
ByteArrayInputStream bais = new ByteArrayInputStream(data);
ContextWrapper cw = new ContextWrapper(ApplicationContextProvider.getContext());
File directory = cw.getDir("imageDir", Context.MODE_PRIVATE);
File mypath = new File(directory, fileName);
//Bitmap bitmap = BitmapFactory.decodeByteArray(img_buff, 0, img_buff.length);
Bitmap bitmap = BitmapFactory.decodeStream(bais);
FileOutputStream fos = new FileOutputStream(mypath);
//Use compress method on Bitmap object to write image to OutputStream
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, fos);
fos.flush();
fos.close();
//Send OK
byte[] OK = new byte[] {0x4F, 0x4B};
sout.write(OK);
} catch (Exception e) {
Log.i("Debug:", "image message" +e);
e.printStackTrace();
}
}
//call the method messageReceived from MyActivity class
mMessageListener.messageReceived(mServerMessage);
}
}
Log.e("RESPONSE FROM SERVER", "S: Received Message: '" + mServerMessage + "'");
} catch (Exception e) {
Log.e("TCP", "S: Error", e);
} finally {
//the socket must be closed. It is not possible to reconnect to this socket
// after it is closed, which means a new socket instance has to be created.
socket.close();
}
} catch (Exception e) {
Log.e("TCP", "C: Error", e);
}
}
//Declare the interface. The method messageReceived(String message) must be implemented in the MainActivity
//class in asynckTask doInBackground
public interface OnMessageReceived {
void messageReceived(String message);
}
}
MainActivity.java:
import android.app.Application;
import android.content.Context;
import android.content.ContextWrapper;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.drawable.Drawable;
import android.os.AsyncTask;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.*;
import org.apache.commons.codec.binary.Base64;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
public class MainActivity extends AppCompatActivity {
private TcpClient mTcpClient;
private TextView dbView;
private TextView roomView;
private TextView classView;
private TextView statusView;
private TextView timeView;
private ImageView signalView;
private ImageView specView;
private Button getAnalysis;
private Button disconnect;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
getAnalysis = findViewById(R.id.get_analysis);
dbView = findViewById(R.id.db_level);
roomView = findViewById(R.id.RoomsValues);
classView = findViewById(R.id.ClassValues);
timeView = findViewById(R.id.timeStamp);
signalView = findViewById(R.id.audioPic);
specView = findViewById(R.id.specPic);
statusView = findViewById(R.id.status);
disconnect = findViewById(R.id.disconnect);
getAnalysis.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v)
{
statusView.setText("Connecting to Auris...\nRoom analytics will arrive shortly.");
new ConnectTask().execute("");
}
});
disconnect.setOnClickListener(new View.OnClickListener(){
@Override
public void onClick(View v)
{
mTcpClient.stopClient();
statusView.setText("Disconnected from Auris.\nReconnect to receive room analysis updates.");
}
});
}
public class ConnectTask extends AsyncTask<String, String, TcpClient> {
@Override
protected TcpClient doInBackground(String... message) {
//we create a TCPClient object and
mTcpClient = new TcpClient(new TcpClient.OnMessageReceived() {
@Override
//here the messageReceived method is implemented
public void messageReceived(String message) {
//this method calls the onProgressUpdate
publishProgress(message);
Log.i("Debug","Input message: " + message);
}
});
//statusView.setText("Get analysis from Auris as it is collected.");
mTcpClient.run();
return null;
}
@Override
protected void onProgressUpdate(String... values) {
super.onProgressUpdate(values);
//Store string of values sent from Auris device
String str = values[0];
//if data starts with +, then it is the string data
if(str.startsWith("+"))
{
//Split values around spaces
/*
Values in data indices
0-8 are room log likelihoods
9-12 are class log likelihoods
13 is dbA level
14 is room model best matched
15 is class model best matched
*/
// Remove +
str = str.substring(1);
String data[]= str.split(" ");
String roomData = "";
String classData = "";
String status;
for(int i = 0; i < 9; i++)
{
roomData = roomData.concat(data[i]);
roomData = roomData.concat("\n");
}
roomView.setText(roomData);
for(int i = 9; i < 13; i++)
{
classData = classData.concat(data[i]);
classData = classData.concat("\n");
}
classView.setText(classData);
dbView.setText(data[13]);
status = "The room most closely matches " + data[14] + " room model & " + data[15] + " class model.";
statusView.setText(status);
}
else if (str.startsWith("TIME"))
{
// Remove "TIME"
str.substring(4);
String message = "This room profile represents the room at " + str + ".";
timeView.setText(message);
}
else
{
try {
String fileName = "";
if(str.equals("signal"))
{
fileName = "signal.jpeg";
}
else if(str.equals("spec"))
{
fileName = "spec.jpeg";
}
ContextWrapper cw = new ContextWrapper(ApplicationContextProvider.getContext());
File directory = cw.getDir("imageDir", Context.MODE_PRIVATE);
File file = new File(directory, fileName);
Bitmap bitmap = BitmapFactory.decodeStream(new FileInputStream(file));
signalView.setImageBitmap(bitmap);
} catch (FileNotFoundException e){
e.printStackTrace();
}
}
Log.i("onProgressUpdate",values[0]);
}
}
}
Python code that sends the image data:
def send_image_to_byte_array(image_file, conn, label):
    with open(image_file, "rb") as imageFile:
        content = imageFile.read()
    conn.sendall("?start\n".encode('utf-8'))
    conn.sendall(label.encode('utf-8'))
    size = len(content)
    strSize = str(size) + "\n"
    conn.sendall(strSize.encode('utf-8'))
    conn.sendall(content)
As far as I can tell, not all of the image bytes are successfully sent from the RPi to the Android app. The lost data causes the Android app to hang on the int bytes_read = in.read(msg_buff, 0, byte_size); line in the run() method of TcpClient.java. From reading various posts, it seems that using struct.pack/unpack fixed this problem when transferring an image from Python to Python, but I don't know how to implement the equivalent of struct in Java. I'm also not sure what the best way to use struct.pack in Python would be. Any help is greatly appreciated! EDIT:
I believe the problem is endianness. From what I've read, the Raspberry Pi is little-endian and Java is big-endian. So these problems are happening when I read the image saved on the Raspberry Pi and try to transfer it from Python to Java. Does anyone know how I could change Java's endianness from big to little, or some other way to solve this problem?
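As an aside on the struct question: a length prefix packed on the Python side with struct.pack(">i", size) (4 bytes, big-endian) can be read on the Java side with DataInputStream.readInt(), which is specified to read four bytes in big-endian order, so no manual endianness conversion is needed. A minimal sketch (the wire bytes below simulate what the Python side would send; the stream names are hypothetical, not from the code above):

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class LengthPrefixDemo {
    // Reads a 4-byte big-endian length prefix, then exactly that many payload bytes.
    static byte[] readLengthPrefixed(DataInputStream in) throws IOException {
        int size = in.readInt();   // readInt() is big-endian by specification
        byte[] payload = new byte[size];
        in.readFully(payload);     // blocks until all 'size' bytes have arrived
        return payload;
    }

    public static void main(String[] args) throws IOException {
        // Simulates struct.pack(">i", 5) + b"hello" sent from Python:
        byte[] wire = {0, 0, 0, 5, 'h', 'e', 'l', 'l', 'o'};
        byte[] payload = readLengthPrefixed(
                new DataInputStream(new ByteArrayInputStream(wire)));
        System.out.println(new String(payload)); // prints "hello"
    }
}
```

Note that the image bytes themselves have no endianness problem: a PNG file is a defined byte sequence, and copying bytes over TCP does not reorder them.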
Best Answer
The problem is caused by the BufferedReader reading ahead extra data (to fill its internal buffer), which makes that data unavailable from the DataInputStream.
As you can see from the sample Android BufferedReader implementation, calling readLine() causes the BufferedReader to try to fill its internal buffer. It does so with whatever bytes are available on its source, up to 8192 characters. And if the BufferedReader has already consumed those bytes, they won't be there when you try to get them from the DataInputStream. This throws off your whole size-accounting scheme, and means you end up blocking in in.read() because you never receive all the data you expected.
The most convenient solution is probably to implement your own version of readLine() that assembles a string one byte at a time until it reaches '\n'. After all, the only reason you need the BufferedReader is its readLine() function.
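A minimal sketch of such a readLine() replacement, reading directly from the DataInputStream one byte at a time so that no bytes beyond the newline are consumed (the demo input below is illustrative, not taken from the actual protocol traffic):

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class RawLineReader {
    // Reads one '\n'-terminated line a byte at a time. Unlike BufferedReader,
    // this never reads ahead, so the image bytes that follow stay in the stream.
    static String readLine(DataInputStream in) throws IOException {
        StringBuilder sb = new StringBuilder();
        int b;
        while ((b = in.read()) != -1 && b != '\n') {
            sb.append((char) b); // assumes single-byte (ASCII) header text
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        // Two header lines followed by (pretend) binary payload:
        byte[] wire = "?start\n42\n".getBytes();
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(wire));
        System.out.println(readLine(in)); // prints "?start"
        System.out.println(readLine(in)); // prints "42"
    }
}
```

With this in place, the BufferedReader (and the wrapping InputStreamReader) can be dropped entirely, and all reads, both header lines and image bytes, go through the single DataInputStream.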
A similar question about java - Image data loss from Python server to Android client (endianness problem??) can be found on Stack Overflow: https://stackoverflow.com/questions/57184485/