If I do something like the following:
ifstream file;
file.open("somefile", ios::binary);
unsigned int data;
file >> data;
My stream will always set the failbit and the data will remain uninitialized. However, if I read a char or unsigned char instead, the stream is fine. perror() is telling me "result too large".
The only thing I saw on Google was a suggestion that operator>> shouldn't be used for binary data (prefer read()), but I find the operator cleaner and easier to use -- and it doesn't require casting everything.
Can someone explain this issue?
The iostream extraction operator (>>) attempts to interpret numerical strings separated by whitespace, not binary data. There are many different ways to encode an unsigned integer in binary form (e.g. a 32-bit 2's complement representation in little-endian byte order). That's why you must use the read/write functions to operate on such binary buffers.
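For example, a minimal sketch of reading a raw 32-bit value with read() might look like this (assuming the file really does contain a 32-bit integer written with the same byte order as the machine reading it; the file name is taken from the question):

#include <cstdint>
#include <fstream>
#include <iostream>

int main() {
    std::ifstream file("somefile", std::ios::binary);
    std::uint32_t data = 0;

    // read() copies raw bytes into the object; no text parsing happens.
    if (file.read(reinterpret_cast<char*>(&data), sizeof data)) {
        std::cout << "read value: " << data << '\n';
    } else {
        std::cerr << "failed to read 4 bytes\n";
    }
}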
However, nothing prevents you from implementing your own class that serializes binary data in whatever form you wish using the insertion and extraction operators. Such a class would likely use the read function of an ifstream object internally. Alternatively, the Boost.Serialization library may already provide exactly what you want.
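As a rough sketch of that idea (the BinaryReader name and interface are invented here for illustration, not an existing library), an extraction-style wrapper could delegate to ifstream::read():

#include <cstdint>
#include <fstream>
#include <iostream>
#include <type_traits>

// Hypothetical thin wrapper: operator>> delegates to ifstream::read().
struct BinaryReader {
    std::ifstream& in;
};

template <typename T>
BinaryReader& operator>>(BinaryReader& r, T& value) {
    static_assert(std::is_trivially_copyable<T>::value,
                  "raw byte copy only makes sense for trivially copyable types");
    // Assumes the file was written with the same endianness and type
    // sizes as the machine reading it.
    r.in.read(reinterpret_cast<char*>(&value), sizeof value);
    return r;
}

int main() {
    std::ifstream file("somefile", std::ios::binary);
    BinaryReader reader{file};

    std::uint32_t a = 0;
    double b = 0.0;
    reader >> a >> b;   // reads 4 bytes, then 8 bytes

    if (file) {
        std::cout << a << ' ' << b << '\n';
    }
}

The wrapper keeps the chained >> syntax the question prefers while still doing raw byte copies underneath.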
It should work the way you describe. However, the designers of the C++ standard were not always elegant; there are many flaws in the design of C++, and even C++11 and C++14 have plenty of defects.
The ideal C++ design (sketched in the example after this list) would be:
1. For a text file: this would read three whitespace-separated tokens, parse them as an integer, a float, and a double, and store them in i, j, and k respectively.
2. For a binary file: this would read 4 or 8 bytes (depending on whether int is 32-bit or 64-bit), then 4 bytes and 8 bytes of binary data, and store them in i, j, and k respectively.
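A concrete illustration of the two cases (the file names are invented; the comments note which behaviour is real and which is only the wished-for design):

#include <fstream>

int main() {
    int i;
    float j;
    double k;

    // Case 1: text mode -- operator>> parses whitespace-separated tokens.
    std::ifstream text("data.txt");
    text >> i >> j >> k;   // works in standard C++ today

    // Case 2: binary mode -- this is the behaviour the answer wishes for.
    // In standard C++, operator>> still tries to parse text here, which is
    // why the question's code sets failbit; raw bytes must be read with read().
    std::ifstream bin("data.bin", std::ios::binary);
    bin >> i >> j >> k;    // does NOT copy raw bytes in real C++
}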
Unfortunately, the current design is to report an error for case 2. Maybe this can be achieved in a future standard.