Wrap subprocess' stdout/stderr

Published 2019-01-14 09:23

I'd like to both capture and display the output of a process that I invoke through Python's subprocess.

I thought I could just pass my file-like object as the named parameters stdout and stderr.

I can see that it accesses the fileno attribute, so it is doing something with the object. However, the write() method is never invoked. Is my approach completely off, or am I just missing something?

import subprocess
import sys

class Process(object):
    class StreamWrapper(object):
        def __init__(self, stream):
            self._stream = stream
            self._buffer = []
        def _print(self, msg):
            print repr(self), msg
        def __getattr__(self, name):
            if name not in ['fileno']:
                self._print("# Redirecting: %s" % name)
            return getattr(self._stream, name)
        def write(self, data):
            print "###########"
            self._buffer.append(data)
            self._stream.write(data)
            self._stream.flush()
        def getBuffer(self):
            return self._buffer[:]
    def __init__(self, *args, **kwargs):
        print ">> Running `%s`" % " ".join(args[0])
        self._stdout = self.StreamWrapper(sys.stdout)
        self._stderr = self.StreamWrapper(sys.stderr)
        kwargs.setdefault('stdout', self._stdout)
        kwargs.setdefault('stderr', self._stderr)
        self._process = subprocess.Popen(*args, **kwargs)
        self._process.communicate()

Update:

Something I'd also like to work is ANSI control sequences that move the cursor and overwrite previously output text. I'm not sure whether that is the correct term, but here's an example of what I mean: I'm trying to automate some Git commands, and Git prints a progress indicator that updates itself in place instead of writing a new line each time.

Update 2

It is important to me that the output of the subprocess is displayed immediately. I've tried using subprocess.PIPE to capture the output and display it manually, but I was only able to see the output once the process had completed. I'd like to see the output in real time.

4 Answers
时光不老,我们不散
#2 · 2019-01-14 10:02

Is there a reason you have a class inside a class? And stdout and stderr can take any real file object, so simply passing an open file should be enough to redirect the stream. For example:

import sys
sys.stdout = open('test.txt','w')
print "Testing!"
sys.stdout.write('\nhehehe')
sys.stdout = sys.__stdout__
sys.exit(0)
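The snippet above redirects print output within the same process. To hand a capture target to a subprocess, though, the object needs a real file descriptor; a plain temporary file qualifies, while an in-memory io.StringIO does not. A Python 3 sketch (the echoed command is a stand-in for a real one):

    import subprocess
    import sys
    import tempfile

    # A temporary file has a real fileno(), so Popen/run can hand it
    # to the child process as its stdout.
    with tempfile.TemporaryFile(mode="w+") as f:
        subprocess.run([sys.executable, "-c", "print('Testing!')"], stdout=f)
        f.seek(0)          # rewind to read back what the child wrote
        captured = f.read()

This captures the output, but it does not display it live; for that, see the pipe-based answers below.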
The star
#3 · 2019-01-14 10:09

Have a look here.

p = subprocess.Popen(cmd,
                     shell=True,
                     bufsize=64,
                     stdin=subprocess.PIPE,
                     stderr=subprocess.PIPE,
                     stdout=subprocess.PIPE)
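To get real-time display on top of this, you can read the pipe line by line as the child produces output, echoing each line while also keeping a copy. A Python 3 sketch; the command here is a hypothetical stand-in for the real one:

    import subprocess
    import sys

    # Hypothetical child process; replace with your actual command.
    cmd = [sys.executable, "-c", "print('line 1'); print('line 2')"]

    captured = []
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT,  # merge stderr into stdout
                         text=True, bufsize=1)      # line-buffered text mode
    for line in p.stdout:       # yields lines as the child emits them
        sys.stdout.write(line)  # display immediately
        captured.append(line)   # ...and keep a copy
    p.wait()

Note this merges stderr into stdout; keeping the two streams separate in real time needs threads or non-blocking I/O, as the last answer explains.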
太酷不给撩
#4 · 2019-01-14 10:19

A file-like object is not close enough. It must be an actual file with an actual file descriptor. Use subprocess's support for pipes and read from them as appropriate.

Summer. ? 凉城
#5 · 2019-01-14 10:23

Stdin, stdout and stderr of a process need to be real file descriptors. (That is actually not a restriction imposed by Python, but rather how pipes work on the OS level.) So you will need a different solution.

If you want to track both stdout and stderr in real time, you will need asynchronous I/O or threads.

  • Asynchronous I/O: With standard synchronous (= blocking) I/O, a read on one of the streams could block, preventing real-time access to the other one. If you are on Unix, you can use non-blocking I/O as described in this answer. On Windows, however, you will be out of luck with this approach. More on asynchronous I/O in Python, and some alternatives, are shown in this video.

  • Threads: Another common way to deal with this problem is to create one thread for each file descriptor you want to read from in real time. Each thread only handles the file descriptor it is assigned to, so blocking I/O does no harm.
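The threaded approach could be sketched like this in Python 3, with one reader thread per pipe so that neither stream can block the other (the child command is a hypothetical stand-in):

    import subprocess
    import sys
    import threading

    def pump(stream, sink, buf):
        # Drain one pipe in its own thread: echo each line immediately
        # and keep a copy for later inspection.
        for line in stream:
            sink.write(line)
            buf.append(line)
        stream.close()

    # Hypothetical child that writes to both streams.
    cmd = [sys.executable, "-c",
           "import sys; print('out'); print('err', file=sys.stderr)"]
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                         text=True, bufsize=1)
    out_buf, err_buf = [], []
    threads = [
        threading.Thread(target=pump, args=(p.stdout, sys.stdout, out_buf)),
        threading.Thread(target=pump, args=(p.stderr, sys.stderr, err_buf)),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    p.wait()

Unlike merging stderr into stdout, this keeps the two streams separately captured while still displaying both in real time.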
