I have the following code on Windows written in C++ with Visual Studio:
FILE* outFile = fopen(outFileName, "a,ccs=UTF-8");
fwrite(buffer.c_str(), buffer.getLength() * sizeof(wchar_t), 1, outFile);
std::wstring newLine = L"\n";
fwrite(newLine.c_str(), sizeof(wchar_t), 1, outFile);
fclose(outFile);
This correctly writes out the file in UTF-8. When I compile and run the same code on Linux, the file is created, but it is zero length. If I change the fopen command as follows, the file is created and non-zero length, but all non-ASCII characters display as garbage:
FILE* outFile = fopen(outFileName, "a");
Does ccs=UTF-8 not work on Linux gcc?
No, the ccs flag is a Microsoft-specific extension to fopen; it does not work on Linux, OS X, Android, iOS, or anywhere else. Relying on such extensions ties your code to Windows and makes it incompatible with every other platform.
Convert your wide string to a byte string containing UTF-8, then write those bytes to the file as usual. There are many ways to do it, but the most standards-compliant one is probably something like the sketch below.
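For example, with std::wstring_convert and std::codecvt_utf8 (deprecated since C++17 but still shipped by the common standard libraries). A minimal sketch, assuming your buffer is a std::wstring; the file name and contents here are placeholders. On Windows, where wchar_t is UTF-16, std::codecvt_utf8_utf16 is the better facet if you need characters outside the BMP:

#include <codecvt>
#include <cstdio>
#include <locale>
#include <string>

int main()
{
    std::wstring buffer = L"wide text with non-ASCII characters";  // hypothetical contents

    // Convert the wide string (plus newline) to a UTF-8 encoded byte string.
    std::wstring_convert<std::codecvt_utf8<wchar_t>> converter;
    std::string utf8 = converter.to_bytes(buffer + L"\n");

    // Plain "a" mode works everywhere, because we now write bytes, not wchar_t.
    FILE* outFile = fopen("out.txt", "a");
    if (!outFile)
        return 1;
    fwrite(utf8.data(), 1, utf8.size(), outFile);
    fclose(outFile);
    return 0;
}

The same converter's from_bytes member does the reverse conversion if you later need to read the UTF-8 file back into a wide string.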