I am looking for a way to call shell scripts from Python and write their stdout and stderr to a file using logging. Here is my code:
import logging
import tempfile
import shlex
import os
import subprocess
import sys

def run_shell_command(command_line):
    command_line_args = shlex.split(command_line)
    logging.info('Subprocess: "' + command_line + '"')
    process_succeeded = True
    try:
        process_output_filename = tempfile.mktemp(suffix='subprocess_tmp_file_')
        process_output = open(process_output_filename, 'w')
        command_line_process = subprocess.Popen(command_line_args,
                                                stdout=process_output,
                                                stderr=process_output)
        command_line_process.wait()
        process_output.close()

        process_output = open(process_output_filename, 'r')
        log_subprocess_output(process_output)
        process_output.close()

        os.remove(process_output_filename)
    except:
        exception = sys.exc_info()[1]
        logging.info('Exception occurred: ' + str(exception))
        process_succeeded = False

    if process_succeeded:
        logging.info('Subprocess finished')
    else:
        logging.info('Subprocess failed')

    return process_succeeded
I am sure that there is a way to do this without creating a temporary file to store the process output. Any ideas?
You could try to pass the pipe directly without buffering the whole subprocess output in memory:
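For example, something along these lines (a sketch; command_line_args and log_subprocess_output() are the names from the question and are assumed to exist):

from subprocess import Popen, PIPE, STDOUT

process = Popen(command_line_args, stdout=PIPE, stderr=STDOUT)
with process.stdout:
    # stream the child's combined stdout/stderr into the logger as it runs
    log_subprocess_output(process.stdout)
exitcode = process.wait()  # 0 means success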
where log_subprocess_output() could look like:
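One possible implementation (a sketch; the exact log message format is an assumption on my part):

import logging

def log_subprocess_output(pipe):
    # read until the pipe is closed; b'' is the sentinel because the pipe yields bytes
    for line in iter(pipe.readline, b''):
        logging.info('got line from subprocess: %r', line)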
You simply have to check the documentation of Popen, in particular about stdout and stderr: they accept either a file object or the PIPE value. Using PIPE allows you to call the communicate() method to retrieve the output. I'd rewrite your code as:
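A sketch of what that rewrite could look like (catching OSError and logging the output line by line are my choices, not something prescribed by the question):

import logging
import shlex
import subprocess

def run_shell_command(command_line):
    command_line_args = shlex.split(command_line)
    logging.info('Subprocess: "' + command_line + '"')

    try:
        command_line_process = subprocess.Popen(command_line_args,
                                                stdout=subprocess.PIPE,
                                                stderr=subprocess.STDOUT)
        # communicate() waits for the process to finish and returns the
        # combined stdout/stderr as a single bytes object
        process_output, _ = command_line_process.communicate()
    except OSError as exception:
        logging.info('Exception occurred: ' + str(exception))
        logging.info('Subprocess failed')
        return False

    for line in process_output.splitlines():
        logging.info(line.decode(errors='replace'))
    logging.info('Subprocess finished')
    return True

Because stderr is redirected to STDOUT, a single communicate() call captures everything the process prints, and no temporary file is needed.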