I have a Java TCP ServerSocket program that expects about 64 bytes of data from a piece of remote hardware. The server code is:
public void run() throws Exception
{
    // Listen on port 11111
    ServerSocket welcomeSocket = new ServerSocket(11111);
    while (true) {
        // Accept an incoming connection
        Socket connectionSocket = welcomeSocket.accept();
        DataInputStream dIn = new DataInputStream(connectionSocket.getInputStream());
        int msgLen = dIn.readInt(); // read a 4-byte big-endian length prefix
        System.out.println("RX Reported Length: " + msgLen);
        byte[] msg = new byte[msgLen];
        if (msgLen > 0) {
            dIn.readFully(msg);
            System.out.println("Message Length: " + msg.length);
            System.out.println("Recv[HEX]: " + StringTools.toHexString(msg));
        }
    }
}
This works correctly when I test it locally with a simple ACK program:
public class ACK_TEST {
    public static void main(String[] args)
    {
        System.out.println("Byte Sender Running");
        try
        {
            ACK_TEST obj = new ACK_TEST();
            obj.run();
        }
        catch (Exception e)
        {
            e.printStackTrace();
        }
    }

    public void run() throws Exception
    {
        Socket clientSocket = new Socket("localhost", 11111);
        DataOutputStream dOut = new DataOutputStream(clientSocket.getOutputStream());
        byte[] rtn = new byte[1];
        rtn[0] = 0x06; // ACK
        dOut.writeInt(rtn.length); // write the 4-byte length prefix
        dOut.write(rtn);           // write the message body
        System.out.println("Byte Sent");
        clientSocket.close();
    }
}
And this correctly produces the expected output on the server side.
However, when I deploy the same server code on the Raspberry Pi and the hardware sends data to it, the reported length is far greater and causes a heap memory error, even with the heap pre-set at 512 MB, which is definitely far more memory than any real packet should need.
My presumption is that I am reading the data from the TCP socket incorrectly, as the debug output from the hardware shows it is certainly not sending packets of this size.
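If that presumption is right, I suspect readInt() is consuming the first four bytes of the hardware's packet and interpreting them as a big-endian length prefix the hardware never sent, so arbitrary protocol bytes become an enormous allocation. A minimal demo of that failure mode, using made-up packet bytes:

import java.io.ByteArrayInputStream;
import java.io.DataInputStream;

public class LengthMisreadDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical first four bytes of a hardware packet (not a length prefix)
        byte[] packetStart = {0x01, 0x23, 0x45, 0x67};
        DataInputStream dIn = new DataInputStream(new ByteArrayInputStream(packetStart));
        int msgLen = dIn.readInt(); // big-endian: 0x01234567 = 19,088,743
        System.out.println("Misread length: " + msgLen);
        // new byte[msgLen] would then try to allocate ~19 MB for one packet,
        // and other byte patterns can demand far more than a 512 MB heap
    }
}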
Update: I have no access to the client source code. I do, however, need to take the incoming TCP data stream, place it into a byte array, and pass it to another function (not shown) that parses out some known hex codes. That function expects a byte array as input.
Update: I reviewed the packet documentation. There is a 10-byte header: the first byte is a protocol identifier, the next two bytes are the packet length (the total number of bytes in the packet, including all the header bytes and the checksum), and the last seven are a unique ID. Therefore, I need to read those two length bytes and create a byte array of that size.
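Based on that documentation, here is a minimal sketch of what I think the read should look like (assuming the two length bytes are big-endian, matching DataInputStream's convention; if the hardware sends them little-endian, the two bytes would need swapping). PacketReader and readPacket are just names I made up for the sketch:

import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

public class PacketReader {
    // Reads one complete packet (header + body + checksum) into a byte array.
    public static byte[] readPacket(InputStream in) throws IOException {
        DataInputStream dIn = new DataInputStream(in);
        byte[] header = new byte[10];
        dIn.readFully(header); // protocol ID (1) + packet length (2) + unique ID (7)
        // Bytes 1-2: total packet length, assumed big-endian, read as unsigned
        int packetLength = ((header[1] & 0xFF) << 8) | (header[2] & 0xFF);
        if (packetLength < header.length) {
            throw new IOException("Bad packet length: " + packetLength);
        }
        byte[] packet = new byte[packetLength];
        System.arraycopy(header, 0, packet, 0, header.length);
        dIn.readFully(packet, 10, packetLength - 10); // remainder incl. checksum
        return packet; // full packet, ready for the hex-parsing function
    }
}

In the server loop, readPacket(connectionSocket.getInputStream()) would replace the readInt()/readFully() pair, and the returned array can go straight into the parsing function, since it holds the complete packet, header and checksum included.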