Background
I am manually writing a large data block into a binary file with `System.IO.BinaryWriter`. I have chosen this due to the improved performance compared to a wide variety of other means of serialization & deserialization (I am currently deserializing with `System.IO.BinaryReader`).
Question
I may need to use the serialized format in other programming languages like Java and/or Rust. Would they be able to understand the raw binary written by `System.IO.BinaryWriter` and read it in a similar manner to .NET's `System.IO.BinaryReader`?

(I am assuming that the new platforms (Java/Rust) will have implicit knowledge of the specific order in which the raw binary was written.)
Side Info
I am aware that Protocol Buffers is meant to be a performant, language-agnostic framework for serializing/deserializing in this scenario, but: (1) I am using F# and it struggles with discriminated unions; (2) it wasn't really that much effort to write my own custom serializer, as my types aren't too complex.
Yes, you can. For numeric types, the data is written in little-endian format.
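As a sketch of what reading that layout from Rust could look like (not from the original answer; the concrete values and helper names are made up for illustration), the standard `from_le_bytes` conversions reverse `BinaryWriter`'s little-endian encoding of integers and IEEE 754 doubles:

```rust
use std::io::{self, Cursor, Read};

/// Read a little-endian i32, matching what BinaryWriter.Write(int) produces.
fn read_i32_le<R: Read>(r: &mut R) -> io::Result<i32> {
    let mut buf = [0u8; 4];
    r.read_exact(&mut buf)?;
    Ok(i32::from_le_bytes(buf))
}

/// Read a little-endian IEEE 754 double, matching BinaryWriter.Write(double).
fn read_f64_le<R: Read>(r: &mut R) -> io::Result<f64> {
    let mut buf = [0u8; 8];
    r.read_exact(&mut buf)?;
    Ok(f64::from_le_bytes(buf))
}

fn main() -> io::Result<()> {
    // Stand-in for the file contents: the bytes BinaryWriter would emit
    // for Write(42) followed by Write(3.5).
    let mut data = Vec::new();
    data.extend_from_slice(&42i32.to_le_bytes());
    data.extend_from_slice(&3.5f64.to_le_bytes());

    let mut cursor = Cursor::new(data);
    assert_eq!(read_i32_le(&mut cursor)?, 42);
    assert_eq!(read_f64_le(&mut cursor)?, 3.5);
    Ok(())
}
```

The key point is that the reader must consume fields in exactly the order (and with exactly the widths) the writer emitted them; there is no self-describing framing in the stream.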
It depends on the types you write with the `BinaryWriter`:

- `byte`, `sbyte` and `byte[]`: no problem.
- `(U)IntXX`: a matter of endianness. The .NET `BinaryWriter` dumps these types in little-endian format.
- `float` and `double`: if both systems use the same IEEE 754 standard, and both systems use the same endianness, then it is no problem.
- `decimal`: this is a .NET-specific type, similar to `Currency` but using a different format. Use carefully.
- `char` and `char[]`: uses the current `Encoding` of the `BinaryWriter`. Use the same encoding on both sides and everything is alright.
- `string`: the length of the string is encoded in the so-called 7-bit-encoded-int format (1 byte for lengths up to 127, etc.), and the content uses the current encoding. To make things compatible, maybe you should dump character arrays with manually written length information.