My Python script uses subprocess to call another script, which produces output very slowly (on a line-by-line basis). I would like to write the output to a file line by line, not only when the whole process ends and the entire output arrives as one string. The following code writes the output to "output.txt" only when "script" ends:
import subprocess

args = ["script"]
file = open('output.txt', 'w')
subprocess.Popen(args, stdout=file)
Is it even possible? Thanks, Chris
Thought I'd share a solution that doesn't use .poll(), .wait() or .communicate(). A couple of points:

- import codecs, because my output includes East Asian UTF-8 text
- a try/except block, to filter out corrupted/invalid UTF-8 text
- '\x0a', to force a Linux newline regardless of the platform
- for line in iter(subproc.stderr.readline, ''), if you need to capture stderr as well
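A minimal sketch along those lines (the child command here is just a stand-in that prints three lines; decoding with errors='replace' stands in for the codecs-based filtering):

```python
import subprocess
import sys

# Stand-in for the slow external "script": a child that prints three lines.
child = [sys.executable, '-u', '-c',
         "for i in range(3): print('line %d' % i)"]

proc = subprocess.Popen(child,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT,  # merge stderr into stdout
                        bufsize=0)                 # unbuffered pipe
with open('output.txt', 'wb') as out:
    # iter(..., b'') yields each line as soon as the child emits it.
    for raw in iter(proc.stdout.readline, b''):
        text = raw.decode('utf-8', errors='replace')  # drop invalid UTF-8
        out.write(text.rstrip('\r\n').encode('utf-8') + b'\x0a')  # force LF
        out.flush()
proc.stdout.close()
```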
Yes, it is possible. Here is a function that I wrote for a test harness used to do unit testing of Python shell scripts.
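A minimal sketch of such a function (the name run_script and its exact signature are assumptions):

```python
import subprocess
import sys

def run_script(args):
    """Run a command; return (returncode, stdout text, stderr text).

    A reconstruction sketch -- the name and signature are assumptions.
    """
    proc = subprocess.Popen(args,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    out, err = proc.communicate()          # waits for the process to exit
    return proc.returncode, out.decode(), err.decode()

# Example: run a small inline script that prints one line and exits cleanly.
rc, out, err = run_script([sys.executable, '-c', "print('hello')"])
```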
The function returns a tuple which contains the shell return code issued by sys.exit(), the standard output text, and the standard error output text. Both are single text strings, so you would need to use splitlines to break them into lines before processing.

If you really need to interact with the output line by line, then it is probably better to use pexpect rather than the subprocess module. Alternatively, you can interact with the process using poll() so that you can read it line by line:
For example:
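A sketch of the poll-based loop (the child command is a placeholder that prints two lines):

```python
import subprocess
import sys

# Placeholder child that prints two lines; swap in your own command.
proc = subprocess.Popen([sys.executable, '-u', '-c',
                         "print('a'); print('b')"],
                        stdout=subprocess.PIPE,
                        bufsize=0)

lines = []
while True:
    raw = proc.stdout.readline()      # blocks until a full line or EOF
    if raw:
        lines.append(raw.decode().rstrip('\r\n'))   # handle each line here
    elif proc.poll() is not None:     # EOF and the child has exited
        break
```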
I had the same problem for a programming language I'm working on, and ended up doing this: https://github.com/perimosocordiae/plumbum/blob/master/lib/stdlib.py#L21
Unfortunately, it involves reading from the output stream a character at a time, accumulating the line until a newline is found. It works, though, and I don't know of any other way to get the same behavior.
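That character-at-a-time loop looks roughly like this (a sketch in the same spirit, not the linked code; the child command is a placeholder):

```python
import subprocess
import sys

def iter_lines(stream):
    """Read one byte at a time, yielding each line as soon as it completes."""
    buf = []
    while True:
        ch = stream.read(1)           # returns b'' at EOF
        if not ch:
            if buf:                   # flush a final unterminated line
                yield b''.join(buf).decode()
            return
        if ch == b'\n':
            yield b''.join(buf).decode().rstrip('\r')
            buf = []
        else:
            buf.append(ch)

# Placeholder child that prints two lines.
proc = subprocess.Popen([sys.executable, '-u', '-c',
                         "print('x'); print('y')"],
                        stdout=subprocess.PIPE,
                        bufsize=0)
collected = [line for line in iter_lines(proc.stdout)]
proc.wait()
```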