I'm trying to read a subprocess's stdout with stdout.readline and put the results (i.e. each line, at the moment it is printed to the terminal) on a multiprocessing.Queue for use in another .py file. However, the call:
import subprocess

res = subprocess.Popen(command, stdout=subprocess.PIPE, bufsize=1)
with res.stdout:
    for line in iter(res.stdout.readline, b''):
        print line
res.wait()
blocks, and the results are only printed once the process completes (or not at all if the process never returns an exit code).
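From what I can tell, the real culprit may be that the child block-buffers its own stdout once it is attached to a pipe instead of a terminal, so no bufsize setting on my side can help. A minimal, untested sketch of forcing an unbuffered child, assuming the child is itself a Python script (child.py is a placeholder name):

import subprocess

# -u makes the child's stdout unbuffered, so each line reaches the pipe
# as soon as it is printed (child.py is a placeholder)
res = subprocess.Popen(['python', '-u', 'child.py'],
                       stdout=subprocess.PIPE, bufsize=1)
for line in iter(res.stdout.readline, b''):
    print line
res.stdout.close()
res.wait()

For non-Python children, wrapping the command with stdbuf -oL (GNU coreutils) is supposed to have a similar effect.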
I've browsed SO for answers to this and tried setting bufsize=1, spawning threads to handle the reading, using file descriptors, etc. None of it seems to work. I might have to use the pexpect module, but I'm not sure how it works yet.
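From its documentation, pexpect runs the child on a pseudo-terminal, which should trick it into line-buffering. An untested sketch of what I gather the usage looks like ('command' is a placeholder for my actual command string):

import pexpect

# spawn runs the child on a pty, so the child should line-buffer its output;
# 'command' is a placeholder
child = pexpect.spawn('command')
for line in child:          # spawn objects are iterable, one line per iteration
    print line.rstrip()
child.close()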
I have also tried

def enqueue_output(out, queue):
    for line in iter(out.readline, b''):
        queue.put([line])
    out.close()

to put the data on the queue, but since out.readline seems to block, the result is the same.
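For completeness, this is roughly how I wired that function up, following the usual thread-plus-queue recipe (sketch; 'my_command' is a placeholder and the 0.1 s poll interval is arbitrary; the function is repeated so the snippet is self-contained):

import subprocess
from threading import Thread
from multiprocessing import Queue
from Queue import Empty            # multiprocessing.Queue raises this on a timed-out get

def enqueue_output(out, queue):
    # runs in a background thread, so the blocking readline
    # never stalls the main program
    for line in iter(out.readline, b''):
        queue.put([line])
    out.close()

command = ['my_command']           # placeholder for my actual command
res = subprocess.Popen(command, stdout=subprocess.PIPE, bufsize=1)

q = Queue()
t = Thread(target=enqueue_output, args=(res.stdout, q))
t.daemon = True                    # don't let the reader thread block exit
t.start()

# drain the queue while the reader thread runs; each item is a [line] list
while t.is_alive() or not q.empty():
    try:
        item = q.get(timeout=0.1)
    except Empty:
        continue
    print item[0].rstrip()
res.wait()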
In short: how do I make the subprocess's output available to me at the time it is printed? The process prints in chunks of 1-10 lines at a time, but I only receive those lines (newline-separated) once the process completes.
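Since the queue is meant to be read from another .py file, here is a sketch of what I picture the consuming side to look like (the None end-of-output sentinel is just a convention I'd have to push from the producer after out.close(); it isn't sent by the code above):

# consumer.py -- sketch of the other file reading from the shared queue;
# assumes each item is a [line] list and that the producer puts None at EOF
from multiprocessing import Process, Queue

def consume(queue):
    while True:
        item = queue.get()         # blocks until the producer puts something
        if item is None:           # sentinel: the subprocess has finished
            break
        print item[0].rstrip()

# producer side would start it with, e.g.:
#   q = Queue()
#   Process(target=consume, args=(q,)).start()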
Related:
Python subprocess readlines() hangs
Python: read streaming input from subprocess.communicate()
Non-blocking read on a subprocess.PIPE in python