I want to write a function that will execute a shell command and return its output as a string, regardless of whether it is an error or a success message. I just want to get the same result that I would have gotten with the command line.

What would be a code example that does such a thing?
For example:
```python
def run_command(cmd):
    # ??????

print run_command('mysqladmin create test -uroot -pmysqladmin12')
# Should output something like:
# mysqladmin: CREATE DATABASE failed; error: 'Can't create database 'test'; database exists'
```
Something like this:

Note that I'm redirecting stderr to stdout; it might not be exactly what you want, but I want error messages also.

This function yields the output line by line as it comes (normally you'd have to wait for the subprocess to finish to get the output as a whole).

For your case the usage would be:
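A sketch of the generator this answer describes (the command string is run through the shell, and the details are illustrative):

```python
import subprocess

def run_command(command):
    # Redirect stderr to stdout so error messages are captured as well.
    p = subprocess.Popen(command,
                         shell=True,
                         stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT)
    # Yield output line by line as the process produces it.
    return iter(p.stdout.readline, b'')

# Usage for the question's command:
# for line in run_command('mysqladmin create test -uroot -pmysqladmin12'):
#     print(line)
```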
The answer to this question depends on the version of Python you're using. The simplest approach is to use the `subprocess.check_output` function:
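For instance (a sketch; the `echo` commands stand in for any program, and the stderr merge addresses the question's request for error messages):

```python
import subprocess

# Capture a program's stdout as bytes.
output = subprocess.check_output(['echo', 'hello'])
print(output)  # b'hello\n'

# To include error messages too, merge stderr into stdout.
merged = subprocess.check_output('echo out; echo err 1>&2',
                                 shell=True, stderr=subprocess.STDOUT)
```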
`check_output` runs a single program that takes only arguments as input.¹ It returns the result exactly as printed to `stdout`. If you need to write input to `stdin`, skip ahead to the `run` or `Popen` sections. If you want to execute complex shell commands, see the note on `shell=True` at the end of this answer.

The `check_output` function works on almost all versions of Python still in wide use (2.7+).² But for more recent versions, it is no longer the recommended approach.

Modern versions of Python (3.5 or higher): `run`
If you're using Python 3.5 or higher, and do not need backwards compatibility, the new `run` function is recommended. It provides a very general, high-level API for the `subprocess` module. To capture the output of a program, pass the `subprocess.PIPE` flag to the `stdout` keyword argument, then access the `stdout` attribute of the returned `CompletedProcess` object. The return value is a `bytes` object, so if you want a proper string, you'll need to `decode` it (assuming the called process returns a UTF-8-encoded string). This can all be compressed to a one-liner:
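A sketch of the calls described in this section (the commands are illustrative):

```python
import subprocess

# Capture the program's output; result.stdout is a bytes object.
result = subprocess.run(['echo', 'hello'], stdout=subprocess.PIPE)

# Decode to get a proper string (assuming UTF-8 output).
text = result.stdout.decode('utf-8')

# The same thing compressed to a one-liner:
text = subprocess.run(['echo', 'hello'],
                      stdout=subprocess.PIPE).stdout.decode('utf-8')

# Pass a bytes object to the input keyword argument to write to stdin.
echoed = subprocess.run(['cat'], input=b'hello, world!',
                        stdout=subprocess.PIPE).stdout
```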
If you want to pass input to the process's `stdin`, pass a `bytes` object to the `input` keyword argument.
You can capture errors by passing `stderr=subprocess.PIPE` (captured to `result.stderr`) or `stderr=subprocess.STDOUT` (captured to `result.stdout` along with the regular output). When security is not a concern, you can also run more complex shell commands by passing `shell=True`, as described in the notes below.

This adds just a bit of complexity compared to the old way of doing things, but I think it's worth the payoff: now you can do almost anything you need to do with the `run` function alone.

Older versions of Python (2.7-3.4): `check_output`
If you are using an older version of Python, or need modest backwards compatibility, you can probably use the `check_output` function as briefly described above. It has been available since Python 2.7.

It takes the same arguments as `Popen` (see below), and returns a string containing the program's output. The beginning of this answer has a more detailed usage example.
You can pass `stderr=subprocess.STDOUT` to ensure that error messages are included in the returned output, but don't pass `stderr=subprocess.PIPE` to `check_output`: it can cause deadlocks. When security is not a concern, you can also run more complex shell commands by passing `shell=True`, as described in the notes below.

If you need to pipe from `stderr` or pass input to the process, `check_output` won't be up to the task. See the `Popen` examples below in that case.

Complex applications & legacy versions of Python (2.6 and below): `Popen`
If you need deep backwards compatibility, or if you need more sophisticated functionality than `check_output` provides, you'll have to work directly with `Popen` objects, which encapsulate the low-level API for subprocesses.

The `Popen` constructor accepts either a single command without arguments, or a list containing a command as its first item, followed by any number of arguments, each as a separate item in the list. `shlex.split` can help parse strings into appropriately formatted lists. `Popen` objects also accept a host of different arguments for process IO management and low-level configuration.

To send input and capture output, `communicate` is almost always the preferred method, as in:
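A sketch of capturing output with `communicate`, including passing data to the process when `stdin=PIPE` is set (the commands are illustrative):

```python
import subprocess

# communicate() waits for the process and returns all its output at once.
p = subprocess.Popen(['echo', 'hello'], stdout=subprocess.PIPE)
out, err = p.communicate()

# With stdin=PIPE, communicate() can also send bytes to the process.
p = subprocess.Popen(['cat'],
                     stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE)
out2, err2 = p.communicate(input=b'hello, world!')
```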
If you set `stdin=PIPE`, `communicate` also allows you to pass data to the process via `stdin`.
Note Aaron Hall's answer, which indicates that on some systems, you may need to set `stdout`, `stderr`, and `stdin` all to `PIPE` (or `DEVNULL`) to get `communicate` to work at all.

In some rare cases, you may need complex, real-time output capturing. Vartec's answer suggests a way forward, but methods other than `communicate` are prone to deadlocks if not used carefully.

As with all the above functions, when security is not a concern, you can run more complex shell commands by passing `shell=True`.

Notes
1. Running shell commands: the `shell=True` argument

Normally, each call to `run`, `check_output`, or the `Popen` constructor executes a single program. That means no fancy bash-style pipes. If you want to run complex shell commands, you can pass `shell=True`, which all three functions support.

However, doing so raises security concerns. If you're doing anything more than light scripting, you might be better off calling each process separately, and passing the output from each as an input to the next:
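One way to chain two programs without `shell=True`, feeding the first one's output to the second one's stdin (a sketch; the `echo`/`tr` pipeline is illustrative):

```python
import subprocess

# Run the first program and capture its output.
first = subprocess.run(['echo', 'HELLO'], stdout=subprocess.PIPE)

# Feed that output to the second program via its stdin.
second = subprocess.run(['tr', 'A-Z', 'a-z'],
                        input=first.stdout, stdout=subprocess.PIPE)
print(second.stdout)
```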
The temptation to directly connect pipes is strong; resist it. Otherwise, you'll likely see deadlocks or have to resort to hacky workarounds.
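For comparison, the same kind of pipeline written with `shell=True` (the security caveats above apply; a sketch):

```python
import subprocess

# The whole pipeline is handed to the shell as a single string.
out = subprocess.check_output('echo HELLO | tr A-Z a-z', shell=True)
print(out)
```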
2. Unicode considerations
`check_output` returns a string in Python 2, but a `bytes` object in Python 3. It's worth taking a moment to learn about unicode if you haven't already.