I'm writing a cross-platform application in C++. All strings are UTF-8-encoded internally. Consider the following simplified code:
#include <string>
#include <iostream>
int main() {
std::string test = u8"Greek: αβγδ; German: Übergrößenträger";
std::cout << test;
return 0;
}
On Unix systems, std::cout expects 8-bit strings to be UTF-8-encoded, so this code works fine.
On Windows, however, std::cout expects 8-bit strings to be in Latin-1 or a similar non-Unicode format (depending on the codepage). This leads to the following output:
Greek: ╬▒╬▓╬│╬┤; German: ├£bergr├Â├ƒentr├ñger
What can I do to make std::cout interpret 8-bit strings as UTF-8 on Windows?
This is what I tried:
#include <string>
#include <iostream>
#include <io.h>
#include <fcntl.h>
int main() {
    _setmode(_fileno(stdout), _O_U8TEXT);
    std::string test = u8"Greek: αβγδ; German: Übergrößenträger";
    std::cout << test;
    return 0;
}
I was hoping that _setmode would do the trick. However, this results in the following assertion error in the line that calls operator<<:
Microsoft Visual C++ Runtime Library
Debug Assertion Failed!
Program: d:\visual studio 2015\Projects\utf8test\Debug\utf8test.exe
File: minkernel\crts\ucrt\src\appcrt\stdio\fputc.cpp
Line: 47
Expression: ( (_Stream.is_string_backed()) || (fn = _fileno(_Stream.public_stream()), ((_textmode_safe(fn) == __crt_lowio_text_mode::ansi) && !_tm_unicode_safe(fn))))
For information on how your program can cause an assertion failure, see the Visual C++ documentation on asserts.
At last, I've got it working. This answer combines input from Miles Budnek, Paul, and mkluwe with some research of my own. First, let me start with code that will work on Windows 10. After that, I'll walk you through the code and explain why it won't work out of the box on Windows 7.
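In outline, the working code looks something like this (a sketch; it assumes <windows.h> for SetConsoleOutputCP and <cstdio> for setvbuf, and relies on the pre-C++20 behavior of u8 string literals, as in the question):

#include <cstdio>
#include <iostream>
#include <string>
#include <windows.h>

int main() {
    // Tell the console to interpret the byte stream it receives as UTF-8.
    SetConsoleOutputCP(CP_UTF8);

    // Enable buffering so multi-byte UTF-8 sequences are not handed to the
    // console one byte at a time (see the explanation below).
    setvbuf(stdout, nullptr, _IOFBF, 1000);

    std::string test = u8"Greek: αβγδ; German: Übergrößenträger";
    std::cout << test << std::endl;  // std::endl also flushes the buffer
    return 0;
}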
The code starts by setting the code page, as suggested by Miles Budnek. This will tell the console to interpret the byte stream it receives as UTF-8, not as some variation of ANSI.
Next, there is a problem in the STL code that comes with Visual Studio. std::cout prints its data to a stream buffer of type std::basic_filebuf. When that buffer receives a string (via std::basic_streambuf::sputn()), it won't pass it on to the underlying file as a whole. Instead, it will pass each byte separately. As explained by mkluwe, if the console receives a UTF-8 byte sequence as individual bytes, it won't interpret them as a single code point. Instead, it will treat them as multiple characters. Each byte within a UTF-8 byte sequence is an invalid code point on its own, so you'll see �'s instead. There is a related bug report for Visual Studio, but it was closed as By Design. The workaround is to enable buffering for the stream. As an added bonus, that will give you better performance. However, you may now need to regularly flush the stream as I do with std::endl, or your output may not show.

Lastly, the Windows console supports both raster fonts and TrueType fonts. As pointed out by Paul, raster fonts will simply ignore the console's code page. So non-ASCII Unicode characters will only work if the console is set to a TrueType font. Up until Windows 7, the default is a raster font, so the user will have to change it manually. Luckily, Windows 10 changes the default font to Consolas, so this part of the problem should solve itself with time.
std::cout is doing exactly what it should: it's sending your UTF-8-encoded text along to the console, but your console will interpret those bytes using its current code page. You need to set your program's console to the UTF-8 code page; a minimal example is shown below. It would be great if Windows switched the default code page to UTF-8, but they likely can't do that due to backwards-compatibility concerns.
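A minimal sketch of that (assuming <windows.h>; as the walkthrough above notes, you may additionally need to enable stream buffering so multi-byte sequences survive the trip):

#include <iostream>
#include <string>
#include <windows.h>

int main() {
    SetConsoleOutputCP(CP_UTF8);  // console now decodes the byte stream as UTF-8
    std::string test = u8"Greek: αβγδ; German: Übergrößenträger";
    std::cout << test << '\n';
    return 0;
}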
Some Unicode characters can't be displayed properly in a console window even if you've changed the code page, because your font does not support them. For example, you need to install a font that supports Arabic if you want to show Arabic characters.
This Stack Overflow page should be helpful.
By the way, the Unicode versions of the console APIs (such as WriteConsoleW) won't come to the rescue, because they internally call their corresponding code-page versions (such as WriteConsoleA). Neither will std::wcout help, because it converts the wchar_t string to a char string internally.
It seems that the Windows console doesn't support Unicode well; I suggest you use MessageBox instead.
Set the console output encoding to UTF-8 using the following Windows API call:
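The call in question is presumably SetConsoleOutputCP, declared in <windows.h>; a minimal sketch:

#include <windows.h>

int main() {
    SetConsoleOutputCP(CP_UTF8);  // CP_UTF8 is code page 65001
    // ... write UTF-8 text to stdout ...
    return 0;
}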
Documentation for that function is available on Windows Dev Center.
The problem is not std::cout but the Windows console. Using C stdio you will get the ü with fputs("\xc3\xbc", stdout); after setting the UTF-8 codepage (either using SetConsoleOutputCP or chcp) and setting a Unicode-supporting font in cmd's settings (Consolas should support over 2000 characters, and there are registry hacks to add more capable fonts to cmd).

If you output one byte after the other with putc('\xc3', stdout); putc('\xbc', stdout); you will get the double tofu, as the console interprets the bytes separately as illegal characters. This is probably what the C++ streams do. See UTF-8 output on Windows console for a lengthy discussion.
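A small sketch of the difference (assuming <windows.h>; stdio buffering is disabled before the single-byte writes so that each byte really does reach the console separately):

#include <cstdio>
#include <windows.h>

int main() {
    SetConsoleOutputCP(CP_UTF8);   // same effect as running `chcp 65001`

    fputs("\xc3\xbc\n", stdout);   // one write, complete sequence: prints ü

    // With buffering off, each byte arrives at the console as a separate
    // write and is treated as an invalid character on its own.
    setvbuf(stdout, nullptr, _IONBF, 0);
    putc('\xc3', stdout);
    putc('\xbc', stdout);
    putc('\n', stdout);
    return 0;
}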
For my own project, I finally implemented a std::stringbuf doing the conversion to Windows-1252. If you really need full Unicode output, this will not really help you, however.

An alternative approach would be overwriting cout's streambuf, using fputs for the actual output, as sketched below. I turned off output buffering there to prevent it from interfering with unfinished UTF-8 byte sequences.
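A sketch of that approach (the class name MBuf is mine; sync() hands the collected bytes to fputs whenever the stream is flushed, e.g. by std::endl):

#include <cstdio>
#include <iostream>
#include <sstream>
#include <windows.h>

// Collects everything written to std::cout and forwards it to fputs() on
// flush, so complete UTF-8 sequences reach the console in one write.
class MBuf : public std::stringbuf {
public:
    int sync() override {
        fputs(str().c_str(), stdout);
        str("");  // clear the accumulated data
        return 0;
    }
};

int main() {
    SetConsoleOutputCP(CP_UTF8);
    setvbuf(stdout, nullptr, _IONBF, 0);  // stdio buffering off, see note above
    MBuf buf;
    std::cout.rdbuf(&buf);                // route std::cout through MBuf
    std::cout << u8"Greek: αβγδ; German: Übergrößenträger" << std::endl;
    return 0;
}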