I'm writing some C++ code that will have to send data over TCP/IP, and I want it to be portable across Linux, Windows, and macOS. Since this is the first time I'm writing portable network code, I basically need some simple functions to add to certain objects, like:
#include <sstream>
#include <string>

class myclass {
    ...members...
public:
    std::string serialize() {
        std::ostringstream out;   // not "out()", which would declare a function
        out << member1;
        out << member2;
        out << member3;
        return out.str();
    }
};
... which is all I need for now. Anyway, I started reading the ostringstream docs and ran into the binary/text problem: apparently the stream converts line breaks into the native sequence of each system. Suppose, for example, that a member is const char* foo = "Hello\nMan\n"; that would be translated into one byte sequence on Linux, another on Windows, and so on. My bytes will go into a packet over the Internet, a machine running a different OS will read them, and I think trouble will occur. Now I have read that I might initialize the stream with ostringstream(std::ios::binary). Will that solve the problem (provided that the matching de-serialization function uses an istringstream(std::ios::binary))? I'm confused about the whole picture; if you could spend a few clarifying lines, that would be much appreciated.
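To make the question concrete, here is the kind of round trip I have in mind (just a sketch; the member types are placeholders):

#include <sstream>
#include <string>

// Serialize: write the members into an in-memory stream
std::string serialize_example(int member1, double member2) {
    std::ostringstream out(std::ios::binary);   // does binary mode even matter here?
    out << member1 << ' ' << member2;           // space as a field separator
    return out.str();
}

// De-serialize: read them back on the receiving machine
void deserialize_example(const std::string& bytes, int& member1, double& member2) {
    std::istringstream in(bytes, std::ios::binary);
    in >> member1 >> member2;
}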
Thanks.
Seconded: use a tested serialization library such as the aforementioned Boost.Serialization or Google Protocol Buffers. Neither should introduce further dependencies.
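For instance, with Protocol Buffers you describe the message once, and the generated code handles encoding and byte order for you. A minimal sketch (the message name, field names, and file names are made up for illustration):

// myclass.proto -- hypothetical schema, compiled with protoc:
//   syntax = "proto3";
//   message MyClass {
//       int32  member1 = 1;
//       string member2 = 2;
//   }

#include "myclass.pb.h"   // generated by protoc
#include <string>

std::string serialize(const MyClass& obj) {
    std::string out;
    obj.SerializeToString(&out);   // platform-independent wire format
    return out;
}

bool deserialize(const std::string& bytes, MyClass& obj) {
    return obj.ParseFromString(bytes);   // false if the bytes are malformed
}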
If you are open to a whole new framework, Qt also offers serialization (via its QDataStream class).
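Concretely, QDataStream serializes into a portable binary representation; a minimal sketch (member types and values are placeholders):

#include <QByteArray>
#include <QDataStream>
#include <QIODevice>
#include <QString>

QByteArray serialize(qint32 member1, const QString& member2) {
    QByteArray bytes;
    QDataStream out(&bytes, QIODevice::WriteOnly);
    out.setVersion(QDataStream::Qt_5_0);   // pin the format version on both peers
    out << member1 << member2;             // byte order is handled by Qt
    return bytes;
}

void deserialize(const QByteArray& bytes, qint32& member1, QString& member2) {
    QDataStream in(bytes);
    in.setVersion(QDataStream::Qt_5_0);
    in >> member1 >> member2;
}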
Why do it all manually if there are great libraries like Boost.Serialization that you can build on?
From their goals:

    "Data Portability - Streams of bytes created on one platform should be readable on any other."

Also of interest for you might be points 4 and 5 of that list:

    "4. Deep pointer save and restore. That is, save and restore of pointers saves and restores the data pointed to.
     5. Proper restoration of pointers to shared data."
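Applied to the class from the question, a minimal sketch could look like this (the member types are assumptions):

#include <boost/archive/text_iarchive.hpp>
#include <boost/archive/text_oarchive.hpp>
#include <boost/serialization/access.hpp>
#include <boost/serialization/string.hpp>
#include <sstream>
#include <string>

class myclass {
    friend class boost::serialization::access;
    int member1;
    double member2;
    std::string member3;

    // One function describes both saving and loading
    template <class Archive>
    void serialize(Archive& ar, const unsigned int /*version*/) {
        ar & member1 & member2 & member3;
    }

public:
    std::string to_bytes() const {
        std::ostringstream out;
        boost::archive::text_oarchive oa(out);
        oa << *this;   // the archive handles the representation
        return out.str();
    }

    void from_bytes(const std::string& bytes) {
        std::istringstream in(bytes);
        boost::archive::text_iarchive ia(in);
        ia >> *this;
    }
};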
Another option is the ACE framework. It provides serialization/de-serialization with CORBA marshalling (see the classes ACE_InputCDR and ACE_OutputCDR). If you don't know ACE: it is a huge framework that includes a complete CORBA runtime, but for serialization you only need the core ACE libraries.
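A rough sketch of the CDR round trip (the values are placeholders; check the ACE docs for the exact signatures):

#include "ace/CDR_Stream.h"

void cdr_round_trip() {
    // Marshal: write the fields in CORBA CDR encoding
    ACE_OutputCDR out;
    out.write_long(42);
    out.write_string("Hello\nMan\n");   // bytes are preserved verbatim

    // The encoded bytes live in a chain of message blocks,
    // ready to be written to a socket
    ACE_InputCDR in(out.begin());

    // Demarshal on the receiving side
    ACE_CDR::Long member1 = 0;
    ACE_CDR::Char* member2 = 0;
    in.read_long(member1);
    in.read_string(member2);   // allocates the string; the caller frees it
    delete [] member2;
}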