Suppose we have the following structure:
struct Something {
int i;
};
If I want to write any data of this type (dynamically allocated) to a file, I do this:
struct Something *object = malloc(sizeof(struct Something));
object->i = 0; // set the member to some value
FILE *file = fopen("output_file", "wb");
fwrite(object, sizeof(struct Something), 1, file);
fclose(file);
Now, my questions:
How do we do this with a structure that contains pointers? I tested using the same method; it worked fine and the data could be read back, but I want to know if there are any risks.
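For example, a sketch of the kind of test I mean (the struct and member names are just illustrative):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

struct WithPointer {
    int length;
    char *data; /* points to separately allocated heap memory */
};

struct WithPointer *p = malloc(sizeof(struct WithPointer));
p->length = 4;
p->data = malloc(4);
memcpy(p->data, "abc", 4);
FILE *file = fopen("output_file", "wb");
fwrite(p, sizeof(struct WithPointer), 1, file); /* stores the pointer value, not the "abc" bytes */
fclose(file);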
What you want is called serialization. See also XDR (a portable binary data format) and libs11n (a C++ serialization library); you often care about data portability: being able to read the data on some different computer.
"serialization" means to "convert" some complex data structure (e.g. a list, a tree, a vector or even your Something
...) into a (serial) byte stream (e.g. a file, a network connection, etc...), and backwards. Dealing with circular data structures or shared sub-components may be tricky.
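For instance, a minimal sketch (the helper names are hypothetical) serializing a dynamically allocated int array: the byte stream holds an element count followed by the elements, and the reader rebuilds the array from that:

#include <stdio.h>
#include <stdlib.h>

/* hypothetical helpers serializing a heap-allocated int array */
void write_int_array(FILE *f, const int *arr, size_t n) {
    fwrite(&n, sizeof n, 1, f);     /* element count first */
    fwrite(arr, sizeof *arr, n, f); /* then the elements themselves */
}

int *read_int_array(FILE *f, size_t *n) {
    if (fread(n, sizeof *n, 1, f) != 1)
        return NULL;
    int *arr = malloc(*n * sizeof *arr);
    if (arr != NULL && fread(arr, sizeof *arr, *n, f) != *n) {
        free(arr);
        return NULL;
    }
    return arr; /* a fresh array: no address from the writing run survives */
}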
You don't want to write raw pointers into a file (though you could), because the written address will probably make no sense at the next execution of your program (e.g. because of ASLR), i.e. when you read the data back.
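A sketch of the alternative, assuming a hypothetical variant of Something holding a char *name: write the bytes the pointer refers to (with a length prefix), never the pointer itself:

#include <stdio.h>
#include <string.h>

struct SomethingNamed { /* hypothetical variant with a pointer member */
    int i;
    char *name;
};

void write_something(FILE *f, const struct SomethingNamed *s) {
    size_t len = strlen(s->name);
    fwrite(&s->i, sizeof s->i, 1, f);
    fwrite(&len, sizeof len, 1, f); /* length prefix so the reader knows how many bytes follow */
    fwrite(s->name, 1, len, f);     /* the characters behind the pointer, not its address */
}

(Writing the raw struct instead can appear to work when you read the file back within the same process, because the stored address happens to still be valid there; it breaks across runs.)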
Read also about application checkpointing and persistence.
For pragmatic reasons (notably ease of debugging and resilience with respect to small software evolutions), it is often better to use some textual data format (such as JSON or YAML) to store such persistent data.
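For the original Something this can be as simple as the sketch below (the line format is an arbitrary choice):

/* textual write: one human-readable line per record */
fprintf(file, "i=%d\n", object->i);

/* textual read */
if (fscanf(file, "i=%d\n", &object->i) != 1) {
    /* handle malformed input here */
}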
You might also be interested in databases. Look first into SQLite, and also into DBMSs ("relational", i.e. SQL-based, ones like PostgreSQL, or NoSQL ones like MongoDB).
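A minimal SQLite sketch (link with -lsqlite3; the database file and table names are arbitrary choices) persisting the i member in a table:

#include <sqlite3.h>

int main(void) {
    sqlite3 *db = NULL;
    if (sqlite3_open("something.db", &db) != SQLITE_OK)
        return 1;
    /* create the table once, then persist one record */
    sqlite3_exec(db, "CREATE TABLE IF NOT EXISTS something(i INTEGER);",
                 NULL, NULL, NULL);
    sqlite3_exec(db, "INSERT INTO something(i) VALUES(0);",
                 NULL, NULL, NULL);
    sqlite3_close(db);
    return 0;
}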
The issue is not writing a single dynamically allocated struct (since you mostly want to write the data content, not the pointer, so it is the same to fwrite a malloc-ed struct or a locally allocated one); it is serializing complex data structures which use lots of weird internal pointers!
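As a sketch of the point (the list layout is assumed), a singly linked list serializes naturally without storing any next pointer: a count plus the payloads encodes the whole chain, and deserialization allocates fresh nodes with fresh pointers:

#include <stdio.h>

struct Node {
    int value;
    struct Node *next; /* internal pointer: never written to the file */
};

void write_list(FILE *f, const struct Node *head) {
    size_t count = 0;
    for (const struct Node *n = head; n != NULL; n = n->next)
        count++;
    fwrite(&count, sizeof count, 1, f); /* the count stands in for the pointer chain */
    for (const struct Node *n = head; n != NULL; n = n->next)
        fwrite(&n->value, sizeof n->value, 1, f); /* only each node's payload */
}

Circular or shared structures need more work, typically numbering each node and writing those numbers in place of pointers.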
Notice that copying garbage collectors use algorithms similar to serialization algorithms (since both need to scan a complex graph of references).
Also, on today's computers, disk or network I/O is a lot slower (e.g. a million times slower) than the CPU, so it makes sense to do some significant computation before writing files.