I'm using the MiscUtil library to communicate between a Java and a C# application over a socket. I am trying to figure out the difference between the output of the following code (this is Groovy, but the Java result is the same):
import java.io.*
def baos = new ByteArrayOutputStream();
def stream = new DataOutputStream(baos);
stream.writeInt(5000)
baos.toByteArray().each { println it }
/* outputs - 0, 0, 19, -120 */
and C#:
using (var ms = new MemoryStream())
using (EndianBinaryWriter writer = new EndianBinaryWriter(EndianBitConverter.Big, ms, Encoding.UTF8))
{
    writer.Write(5000);
    ms.Position = 0;
    foreach (byte bb in ms.ToArray())
    {
        Console.WriteLine(bb);
    }
}
/* outputs - 0, 0, 19, 136 */
As you can see, the last byte is -120 in the Java version and 136 in C#.
This has to do with the fact that bytes are signed in Java (the JVM) and unsigned in C#. It has nothing to do with big- or little-endian representation.
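You can see that both programs write the same bits by printing each byte twice, once as Java sees it and once masked with 0xFF. Here is a minimal Java sketch (5000 in hex is 0x00001388, so writeInt emits the big-endian bytes 0x00, 0x00, 0x13, 0x88):
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class SignedVsUnsigned {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        DataOutputStream stream = new DataOutputStream(baos);
        stream.writeInt(5000);
        for (byte b : baos.toByteArray()) {
            // Same bit pattern, two interpretations: signed, then masked to unsigned
            System.out.println(b + " unsigned: " + (b & 0xFF));
        }
        /* last line outputs - -120 unsigned: 136 */
    }
}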
In other words, Java's bytes range from -128 to 127, while C#'s bytes range from 0 to 255. Thus, when a byte holds a bit pattern above 127, Java's signed interpretation wraps around to a negative value: 136 - 256 = -120. From the Java Tutorials:
"byte: The byte data type is an 8-bit signed two's complement integer. It has a minimum value of -128 and a maximum value of 127 (inclusive)."
From the MSDN:
"Byte: Represents an 8-bit unsigned integer."
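Putting the two definitions together, here is a minimal sketch of the wraparound itself (plain Java; Byte.toUnsignedInt requires Java 8+):
public class WrapAround {
    public static void main(String[] args) {
        // The narrowing cast keeps only the low 8 bits; values above 127
        // come out negative under two's complement: 136 - 256 == -120.
        byte b = (byte) 136;
        System.out.println(b);                     /* outputs - -120 */
        System.out.println(Byte.toUnsignedInt(b)); /* outputs - 136 */
    }
}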