Question:
Is it possible to execute an arbitrary number of commands in sequence using the same subprocess command?
I need each command to wait for the previous one to complete before executing, and I need them all to be executed in the same session/shell. I also need this to work in Python 2.6 and Python 3.5, and I need the subprocess command to work on Linux, Windows and macOS (which is why I'm just using echo commands as examples here).
See non-working code below:
import sys
import subprocess
cmds = ['echo start', 'echo mid', 'echo end']
p = subprocess.Popen(cmd=tuple([item for item in cmds]),
                     stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
for line in iter(p.stdout.readline, b''):
    sys.stdout.flush()
    print(">>> " + line.rstrip())
If this is not possible, which approach should I take in order to execute my commands in synchronous sequence within the same session/shell?
Answer 1:
If you want to execute many commands one after the other in the same session/shell, you must start a shell and feed it all the commands, one at a time, each followed by a newline, and close the pipe at the end. This makes sense if some commands are not true processes but shell commands that could, for example, change the shell environment.
Example using Python 2.7 under Windows:
encoding = 'latin1'
p = subprocess.Popen('cmd.exe', stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE)
for cmd in cmds:
    p.stdin.write(cmd + "\n")
p.stdin.close()
print p.stdout.read()
To have this code run under Linux, you would have to replace cmd.exe with /bin/bash and probably change the encoding to utf8.
For Python 3, you would have to encode the commands, probably decode their output, and use parentheses with print.
Beware: this can only work for little output. If there was enough output to fill the pipe buffer before closing the stdin pipe, this code would deadlock. A more robust way would be to have a second thread to read the output of the commands to avoid that problem.
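For reference, here is a minimal Python 3 sketch of the same idea (my adaptation, not part of the original answer), assuming /bin/bash on Linux; communicate() feeds all the input and reads the output concurrently, so the deadlock described above cannot occur:
import subprocess

cmds = ['echo start', 'echo mid', 'echo end']
shell = '/bin/bash'                      # use 'cmd.exe' on Windows
script = '\n'.join(cmds) + '\n'          # one command per line, trailing newline

p = subprocess.Popen(shell, stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate(script.encode('utf8'))   # Python 3: bytes in, bytes out
print(out.decode('utf8'))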
Answer 2:
One possible solution; it looks like it's running in the same shell:
subprocess.Popen('echo start;echo mid;echo end', shell=True)
Note - If you pass your command as a string, then shell has to be True.
Note - This works on Linux only; you may have to find a similar way on Windows.
Hope it will help.
From the Python docs -
On Unix with shell=True, the shell defaults to /bin/sh. If args is a
string, the string specifies the command to execute through the shell.
This means that the string must be formatted exactly as it would be
when typed at the shell prompt.
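A short sketch of that shell=True approach (my adaptation, the command list is assumed): join the commands with the shell's own separator so they run one after another in a single shell, and wait() to keep things synchronous.
import subprocess

cmds = ['echo start', 'echo mid', 'echo end']
joined = ';'.join(cmds)          # on Windows cmd.exe, '&' is the equivalent separator
p = subprocess.Popen(joined, shell=True)
p.wait()                         # block until the whole sequence has finished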
Answer 3:
This is similar to the answer posted by Serge Ballesta, but not quite. Use his for asynchronous execution, where you don't care about the results. Use mine for synchronous processing and result gathering. Like his answer, I'm showing the Windows solution here - run a bash process in Linux rather than cmd in Windows.
from subprocess import Popen, PIPE
process = Popen( "cmd.exe", shell=False, universal_newlines=True,
                 stdin=PIPE, stdout=PIPE, stderr=PIPE )
out, err = process.communicate( commands )
USAGE DETAILS: The commands argument being passed here to the process.communicate method is a newline-delimited string. If, for example, you just read a batch file's contents into a string, you could run it this way because it would already have the newlines. Important: your string must end in a newline "\n". If it does not, that final command will fail to execute, just as if you had typed it into your command prompt but didn't hit enter at the end. You will, however, see a mysterious More? line at the end of the stdout returned (that's the cause if you encounter this).
process.communicate runs synchronously by definition, and returns the stdout and stderr messages (if you directed them to subprocess.PIPE in your Popen constructor).
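A small usage sketch of the above (the command list is assumed, not from the answer): build the newline-delimited string from a list and make sure it ends with "\n" so the last command actually runs.
from subprocess import Popen, PIPE

cmds = ['@echo off', 'echo start', 'echo mid', 'echo end']
commands = '\n'.join(cmds) + '\n'    # trailing newline is required

process = Popen('cmd.exe', shell=False, universal_newlines=True,
                stdin=PIPE, stdout=PIPE, stderr=PIPE)
out, err = process.communicate(commands)
print(out)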
When you create a cmd.exe process in this way and pass a string to it, the results are exactly as if you had opened a command prompt window and entered the commands into it. And I mean that quite literally. If you test this, you will see that the stdout which is returned contains your commands. (It does NOT matter if you include an @echo off as you would when executing a batch file.)
Tips for those who care about "clean" stdout results:
@echo off will not suppress your commands from appearing in this returned string, but it does remove the extra newlines that would otherwise find their way in there (universal_newlines=True strips another set of those).
Including an @ symbol prefix on your commands still allows them to execute. In a "normal" batch process, that is the line-by-line way to "hide" your commands. In this context, it's a safe and easy marker by which you can find the stdout lines you want to remove (if one were so inclined).
The cmd.exe "header" will appear in your output (which states the version of Windows etc.). Since you probably want to start your set of commands with @echo off to cut out the extra newlines, that is also a great way to find where the header lines stop and your commands/results begin.
Finally, to address concerns about "large" output filling the pipes and causing problems: first, I think you need a HUGE amount of data coming back for that to be an issue - more than most people will encounter in their use cases. Second, if it really is a concern, just open a file for writing and pass that file handle (the reference to the file object) to stdout/err instead of PIPE. Then do whatever you want with the file you've created.
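A sketch of that file-handle variant (the file name is hypothetical): pass an open file object as stdout/stderr so large output goes to disk instead of a pipe.
from subprocess import Popen, PIPE

commands = 'echo start\necho mid\necho end\n'
with open('session_output.txt', 'w') as log:          # hypothetical output file
    process = Popen('cmd.exe', universal_newlines=True,
                    stdin=PIPE, stdout=log, stderr=log)
    process.communicate(commands)
# inspect session_output.txt afterwards for the captured output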
Answer 4:
Here is a function (and a main to run it) that I use. I would say you can use it for your problem, and it is flexible.
# processJobsInAList.py
# 2016-09-27 7:00:00 AM Central Daylight Time
import win32process, win32event

def CreateMyProcess2(cmd):
    ''' create a process with no window that runs a command with arguments
        and returns the process handle '''
    si = win32process.STARTUPINFO()
    info = win32process.CreateProcess(
        None,                                 # AppName
        cmd,                                  # Command line
        None,                                 # Process Security
        None,                                 # Thread Security
        0,                                    # inherit Handles?
        win32process.NORMAL_PRIORITY_CLASS,
        None,                                 # New environment
        None,                                 # Current directory
        si)                                   # startup info
    # info is tuple (hProcess, hThread, processId, threadId)
    return info[0]

if __name__ == '__main__':
    ''' create/run a process for each list element in "cmds";
        output may be out of order because processes run concurrently '''
    cmds = ["echo my", "echo heart", "echo belongs", "echo to", "echo daddy"]
    handles = []
    for i in range(len(cmds)):
        cmd = 'cmd /c ' + cmds[i]
        handle = CreateMyProcess2(cmd)
        handles.append(handle)
    rc = win32event.WaitForMultipleObjects(handles, 1, -1)  # 1 = wait for all, -1 = wait infinite
    print 'return code ', rc
output:
heart
my
belongs
to
daddy
return code 0
UPDATE: If you want to run the same process, which will serialize things for you (a sketch follows these steps):
1) Remove line: handles.append(handle)
2) Substitute the variable "handle" in place of the list "handles" on the "WaitFor" line
3) Substitute WaitForSingleObject in place of WaitForMultipleObjects
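A sketch of how those three steps might look (my reading of the UPDATE, not the author's code), reusing CreateMyProcess2 from above and the same -1 "wait infinite" timeout:
# serialized variant of the __main__ block above; CreateMyProcess2 is unchanged
import win32process, win32event

cmds = ["echo my", "echo heart", "echo belongs", "echo to", "echo daddy"]
for i in range(len(cmds)):
    cmd = 'cmd /c ' + cmds[i]
    handle = CreateMyProcess2(cmd)
    rc = win32event.WaitForSingleObject(handle, -1)   # wait for this one before starting the next
    print 'return code ', rc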
Answer 5:
This one works in Python 2.7 and should also work on Windows. Some small refinement is probably needed for Python 3.
The produced output is shown below (using date and sleep makes it easy to see that the commands are executed in sequence):
>>>Die Sep 27 12:47:52 CEST 2016
>>>
>>>Die Sep 27 12:47:54 CEST 2016
As you can see, the commands are executed in sequence.
import sys
import subprocess
import shlex

cmds = ['date', 'sleep 2', 'date']
cmds = [shlex.split(x) for x in cmds]

outputs = []
for cmd in cmds:
    outputs.append(subprocess.Popen(cmd,
        stdout=subprocess.PIPE, stderr=subprocess.STDOUT).communicate())

for line in outputs:
    print ">>>" + line[0].strip()
This is what I obtain by merging with @Marichyasana's answer:
import sys
import os

def run_win_cmds(cmds):
    # @Marichyasana code (+/-)
    pass

def run_unix_cmds(cmds):
    import subprocess
    import shlex
    cmds = [shlex.split(x) for x in cmds]
    outputs = []
    for cmd in cmds:
        outputs.append(subprocess.Popen(cmd,
            stdout=subprocess.PIPE, stderr=subprocess.STDOUT).communicate())
    rc = ''
    for line in outputs:
        rc += line[0].strip() + '\n'
    return rc

cmds = ['date', 'sleep 2', 'date']
if os.name == 'nt':
    run_win_cmds(cmds)
elif os.name == 'posix':
    run_unix_cmds(cmds)
Ask if this one does not fit your needs! ;)