I suspect this has to do with endianness but I'm not sure how to fix it. I have a C++ client telling a Java server how many bytes it's about to send where the Java server just calls readInt() on the input stream. Then the server goes onto read the rest of the data.
At the moment the C++ client calls:
char l = '3';
BytesSent = send( Socket, &l, 1, 0 );
Then the corresponding Java side is:
int lBytesSent = m_InDataStream.readInt();
m_AckNack = new byte[lBytesSent];
m_InDataStream.read(m_AckNack);
lBytesSent tends to be some massive value, which (not surprisingly) throws an exception when it comes to creating the array.
The C++ socket is simply opened up with:
Socket = socket(AF_INET, SOCK_STREAM, 0);
Option = 1000;
setsockopt(Socket, SOL_SOCKET, SO_RCVTIMEO, (char *) &Option, sizeof(Option));
server.sin_family = AF_INET;
server.sin_port = htons(Port);
server.sin_addr.s_addr = INADDR_ANY;
memset(&(server.sin_zero), '\0', 8);
connect(Socket, (sockaddr*)&server, sizeof(server));
And the Java side:
ServerSocket listener = new ServerSocket(port);
Socket server;
server = listener.accept();
I've removed the error checking for clarity.
Any suggestions would be great
Many Thanks
Mark
Try running the number through htonl before sending it (on the C++ side):
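As a minimal sketch of that idea (the helper name `sendLength` is hypothetical): `htonl` converts the length to network byte order (big-endian), which is what Java's `DataInputStream.readInt()` expects. Note also that `readInt()` always consumes exactly four bytes, so the length must be sent as a full 32-bit integer, not as a single `char` the way your current code does.

```cpp
#include <arpa/inet.h>   // htonl
#include <cstdint>
#include <sys/socket.h>  // send

// Hypothetical helper: send a 4-byte length prefix in the big-endian
// (network) order that Java's DataInputStream.readInt() expects.
bool sendLength(int sock, uint32_t length)
{
    uint32_t wire = htonl(length);  // host byte order -> network byte order
    // Send all four bytes; readInt() on the Java side reads exactly four.
    return send(sock, reinterpret_cast<const char*>(&wire),
                sizeof(wire), 0) == sizeof(wire);
}
```

On the Java side nothing needs to change: `readInt()` is defined to read four bytes in big-endian order, so once the C++ side sends a big-endian 32-bit length, the two ends agree.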