I am using subprocess to spawn a long-running process from the web interface of a Django application. If the user later comes back to the page, I would like to give them the option of terminating that subprocess.
How can I do this? I implemented the same thing in Java with a global singleton ProcessManager dictionary that stores the Process objects in memory. Can I do something similar in Python?
EDIT
Yes, a singleton ProcessManager holding a hash of processes is the way to do this cleanly. Emmanuel's code works perfectly fine with a few modifications.
Thanks
I think an easy way to implement the singleton pattern in Python is via class attributes:
import subprocess

class ProcessManager(object):
    __PROCESS = None

    @staticmethod
    def set_process(args):
        # Starts the singleton process if none is running yet
        if ProcessManager.__PROCESS is None:
            ProcessManager.__PROCESS = subprocess.Popen(args)
        else:
            raise RuntimeError("A process is already running")

    @staticmethod
    def kill_process():
        # Kills the singleton process
        if ProcessManager.__PROCESS is None:
            raise RuntimeError("No process is running")
        else:
            ProcessManager.__PROCESS.kill()
            ProcessManager.__PROCESS = None
Then you can use this class via:
from my_module import ProcessManager
my_args = ...
ProcessManager.set_process(my_args)
...
ProcessManager.kill_process()
Notes:
- the ProcessManager is in charge of creating the process, to be symmetrical with its ending
- I don't have enough knowledge of multi-threading to know whether this works in a multi-threaded environment
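On that last point, if several request threads might call the manager at the same time, one possible precaution is to guard the class attribute with a lock. A minimal sketch of that idea, assuming the same ProcessManager interface as above (this is my own variation, not part of the original answer):

import subprocess
import threading

class ProcessManager(object):
    __PROCESS = None
    __LOCK = threading.Lock()  # serializes access to __PROCESS

    @staticmethod
    def set_process(args):
        # Start the singleton process while holding the lock
        with ProcessManager.__LOCK:
            if ProcessManager.__PROCESS is None:
                ProcessManager.__PROCESS = subprocess.Popen(args)
            else:
                raise RuntimeError("A process is already running")

    @staticmethod
    def kill_process():
        # Kill the singleton process while holding the lock
        with ProcessManager.__LOCK:
            if ProcessManager.__PROCESS is None:
                raise RuntimeError("No process is running")
            ProcessManager.__PROCESS.kill()
            ProcessManager.__PROCESS = None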
You can use the same technique in Python as you did in Java, that is, store the reference to the process in a module-level variable or implement a kind of singleton.
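For illustration, a minimal sketch of the module-variable approach; the module name process_holder and the function names are hypothetical:

# process_holder.py -- imported once per application instance
import subprocess

_process = None  # module-level reference to the running child process

def start(args):
    global _process
    # Only start a new child if none is running (or the old one has exited)
    if _process is None or _process.poll() is not None:
        _process = subprocess.Popen(args)
    return _process

def stop():
    global _process
    if _process is not None and _process.poll() is None:
        _process.terminate()
    _process = None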
The only problem you have, as opposed to Java, is that Python has no rich analogue of the Servlet specification, so there is no interface for handling application start-up or shutdown. In most cases you should not be worried about how many instances of your application are running, because you fetch all data from persistent storage. But in this case you need to understand how your application is deployed.
If there is a single long-running instance of your application (a FastCGI instance, for example, or a single WSGI application on CherryPy), you can isolate the process-handling functionality in a separate module and load it when the module is imported (any module is imported only once within an application). If there are many instances (many FastCGI instances, or plain CGI scripts), you are better off detaching the child processes and keeping their IDs in persistent storage (a database, or files), intersecting them with the list of currently running processes on demand.
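A rough sketch of that second approach, recording the child's PID in a file and later checking whether it is still alive before terminating it. The file path and function names are made up, the probe uses os.kill(pid, 0) and so is POSIX-specific, and the child is not fully daemonized here; only the PID bookkeeping is shown:

import os
import signal
import subprocess

PID_FILE = "/tmp/myapp_worker.pid"  # hypothetical location; a database row works too

def start_and_record(args):
    # Start the child and persist its PID so a later request can find it
    p = subprocess.Popen(args)
    with open(PID_FILE, "w") as f:
        f.write(str(p.pid))
    return p.pid

def kill_if_running():
    try:
        with open(PID_FILE) as f:
            pid = int(f.read().strip())
    except (IOError, ValueError):
        return False  # no PID recorded
    try:
        os.kill(pid, 0)               # probe: raises OSError if the process is gone
        os.kill(pid, signal.SIGTERM)  # ask the process to terminate
        return True
    except OSError:
        return False                  # process already exited
    finally:
        os.remove(PID_FILE)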