Writing unicode strings via sys.stdout in Python

Posted 2020-01-27 02:28

Question:

Assume for a moment that one cannot use print (and thus enjoy the benefit of automatic encoding detection). That leaves us with sys.stdout. However, sys.stdout is dumb and does not do any sensible encoding.

Now one reads the Python wiki page PrintFails and goes to try out the following code:

$ python -c 'import sys, codecs, locale; print str(sys.stdout.encoding); \
  sys.stdout = codecs.getwriter(locale.getpreferredencoding())(sys.stdout); \
  sys.stdout.write(u"\u0411\n")'

However, this too does not work (at least on a Mac). To see why:

>>> import locale
>>> locale.getpreferredencoding()
'mac-roman'
>>> sys.stdout.encoding
'UTF-8'

(UTF-8 is what one's terminal understands).

So one changes the above code to:

$ python -c 'import sys, codecs, locale; print str(sys.stdout.encoding); \
  sys.stdout = codecs.getwriter(sys.stdout.encoding)(sys.stdout); \
  sys.stdout.write(u"\u0411\n")'

And now unicode strings are properly sent to sys.stdout and hence printed properly on the terminal (sys.stdout is attached to the terminal).

Is this the correct way to write unicode strings in sys.stdout or should I be doing something else?

EDIT: at times (say, when piping the output to less) sys.stdout.encoding will be None; in that case, the above code will fail.
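One possible workaround (just a sketch, not necessarily the right answer) is to fall back to the locale's preferred encoding whenever sys.stdout.encoding is None:

import sys, codecs, locale

# sys.stdout.encoding is None when stdout is a pipe (e.g. "| less"),
# so fall back to the locale's preferred encoding in that case.
encoding = sys.stdout.encoding or locale.getpreferredencoding()
writer = codecs.getwriter(encoding)(sys.stdout)
writer.write(u"caf\u00e9\n")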

Answer 1:

It's not clear to me why you wouldn't be able to use print; but assuming so, yes, the approach looks right to me.



Answer 2:

export PYTHONIOENCODING=utf-8

will do the job, but it cannot be set from within Python itself ...

What we can do is verify whether it is set and, if it isn't, tell the user to set it before calling the script:

import sys

if __name__ == '__main__':
    if sys.stdout.encoding is None:
        # Refuse to run rather than fail later on the first non-ASCII character.
        print >> sys.stderr, "Please set PYTHONIOENCODING, e.g. export PYTHONIOENCODING=UTF-8, before writing to stdout."
        sys.exit(1)


Answer 3:

The best idea is to check whether you are directly connected to a terminal. If you are, use the terminal's encoding; otherwise, use the system's preferred encoding.

import sys, locale

if sys.stdout.isatty():
    # Attached to a terminal: trust the encoding it reports.
    default_encoding = sys.stdout.encoding
else:
    # Piped or redirected: fall back to the locale's preferred encoding.
    default_encoding = locale.getpreferredencoding()

It's also very important to always allow the user to specify whichever encoding they want. Usually I make it a command-line option (like -e ENCODING) and parse it with the optparse module.
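A minimal optparse sketch of that (the -e/--encoding option name and the default shown here are just illustrations):

import sys, locale
from optparse import OptionParser

# Default to the encoding detected as above (terminal's if attached, locale's otherwise).
default_encoding = sys.stdout.encoding if sys.stdout.isatty() else locale.getpreferredencoding()

parser = OptionParser()
parser.add_option("-e", "--encoding", dest="encoding", default=default_encoding,
                  help="output character encoding [default: %default]")
options, args = parser.parse_args()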

Another good idea is not to overwrite sys.stdout with an automatic encoder. Create your own encoder and use it, but leave sys.stdout alone, since you may import third-party libraries that write encoded byte strings directly to sys.stdout.
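A minimal sketch of that approach (variable names here are only illustrative):

import sys, codecs, locale

# Pick the encoding as above (terminal's if attached, locale's otherwise).
encoding = sys.stdout.encoding if sys.stdout.isatty() else locale.getpreferredencoding()

# Write unicode through a separate wrapper object; sys.stdout itself stays untouched.
out = codecs.getwriter(encoding)(sys.stdout)
out.write(u"na\u00efve caf\u00e9\n")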



Answer 4:

There is an optional environment variable "PYTHONIOENCODING" which may be set to a desired default encoding. It would be one way of grabbing the user-desired encoding in a way consistent with all of Python. It is documented, somewhat buried, in the Python manual.
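A quick way to see it take effect (on Python 2.6+, where the variable is honoured) is to set it for a single run; sys.stdout.encoding should then report the configured name even though stdout is a pipe:

$ PYTHONIOENCODING=utf-8 python -c 'import sys; print sys.stdout.encoding' | cat
utf-8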



Answer 5:

This is what I am doing in my application:

sys.stdout.write(s.encode('utf-8'))

And this is the exact opposite fix, for reading UTF-8 names from argv:

import sys

for file in sys.argv[1:]:
    file = file.decode('utf-8')

This is very ugly (IMHO) as it forces you to work with UTF-8, which is the norm on Linux/Mac but not on Windows... Works for me anyway :)