C++ writing UTF-8 on Linux

Published 2019-08-02 09:00

I have the following code on Windows written in C++ with Visual Studio:

  FILE* outFile = fopen(outFileName, "a,ccs=UTF-8");
  fwrite(buffer.c_str(), buffer.getLength() * sizeof(wchar_t), 1, outFile);
  std::wstring newLine = L"\n";
  fwrite(newLine.c_str(), sizeof(wchar_t), 1, outFile);
  fclose(outFile);

This correctly writes out the file in UTF-8. When I compile and run the same code on Linux, the file is created, but it is zero length. If I change the fopen command as follows, the file is created and non-zero length, but all non-ASCII characters display as garbage:

  FILE* outFile = fopen(outFileName, "a");

Does ccs=UTF-8 not work with gcc on Linux?

1 Answer

啃猪蹄的小仙女 · 2019-08-02 09:33

No. The ccs mode flag is a Microsoft extension to fopen; it does not work on Linux, OS X, Android, iOS, or anywhere else. Code that relies on such extensions is inherently incompatible with other platforms.

Convert your wide string to a byte string containing UTF-8, then write the bytes to the file as usual. There are many ways to do it, but the most standards-compliant way is perhaps this:

#include <cstdio>
#include <string>
#include <codecvt>
#include <locale>

// std::wstring_convert and <codecvt> are deprecated since C++17,
// but they remain the most portable standard-library option.
using Converter = std::wstring_convert<std::codecvt_utf8_utf16<wchar_t>, wchar_t>;

int main()
{
    std::wstring wide = L"Öö Tiib\n";
    std::string utf8 = Converter{}.to_bytes(wide);  // UTF-8 encoded bytes

    // Write the raw bytes with plain fopen/fwrite; no ccs flag needed.
    FILE* outFile = fopen("out.txt", "a");
    fwrite(utf8.data(), 1, utf8.size(), outFile);
    fclose(outFile);
}