I have a multi-process Python application (processes are spawned by uwsgi) that needs to store variables in RAM, then read and update those variables from several different processes. I know there are a lot of caching options available, but all the ones I've found can only store strings. Is it possible for different Python processes to access the same virtual memory, and thus share data without ever converting it or even copying it?
Besides POSH (Python Shared Objects), which does at least part of what you want (it places Python objects in SysV-IPC shared memory and lets multiple processes modify them) and could serve as a starting point for developing your own extension module to fit your wsgi-spawned server processes, there's not much else in the Python world (that I know of...) that doesn't rely on pickling/unpickling objects when sharing them between processes.
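For contrast, here's a minimal sketch of the pickling-based approach the standard library gives you: a `multiprocessing.Manager` dict looks shared, but every read and write is pickled and shipped to a separate manager process rather than accessed in place (the variable names are mine, just for illustration):

```python
# Pickling-based sharing via multiprocessing.Manager: values in the managed
# dict are serialized and sent to a manager process on every access --
# nothing is shared in RAM directly.
from multiprocessing import Manager, Process

def worker(shared):
    # Each access here pickles/unpickles the value behind the scenes.
    shared['counter'] = shared.get('counter', 0) + 1

if __name__ == '__main__':
    manager = Manager()
    shared = manager.dict()          # proxy backed by the manager process
    procs = [Process(target=worker, args=(shared,)) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(dict(shared))              # note: unsynchronized updates may race
```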
One other thing that comes to mind is Pyro, which shares objects over arbitrary network connections between processes (so it can also share via Unix-domain sockets), and is in my own experience more flexible than what the built-in multiprocessing can offer for (proxy) object management.
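A minimal sketch of what that could look like, using the Pyro4 flavour of the API over a Unix-domain socket (the `SharedCache` class and socket path are my own illustration, not from the answer; calls still go over the socket, so arguments and results are serialized):

```python
# Server process: expose a cache object via Pyro4 on a Unix-domain socket.
import Pyro4

@Pyro4.expose
class SharedCache(object):
    def __init__(self):
        self._data = {}

    def get(self, key):
        return self._data.get(key)

    def set(self, key, value):
        self._data[key] = value

daemon = Pyro4.Daemon(unixsocket='/tmp/cache.sock')     # no TCP port needed
uri = daemon.register(SharedCache(), objectId='cache')
print(uri)      # clients connect with Pyro4.Proxy("PYRO:cache@./u:/tmp/cache.sock")
daemon.requestLoop()
```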
What you might also have a look at is whether you can get the webserver that's driving your WSGI application to use threading rather than forking processes; that way you'd simply use standard Python global data for the shared object cache, which every spawned WSGI handler thread can access. One threaded WSGI server is the CherryPy built-in wsgiserver, which I'm using for a project with exactly the requirement you describe. mod_wsgi also works in your context if you configure Apache with the worker MPM (so that Apache serves requests with threads rather than forking a new process per request).
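The shared-cache part of that is then just a module-level dict plus a lock. Here's a sketch served by CherryPy's bundled wsgiserver (the import path has moved between CherryPy versions, newer releases ship it as the separate cheroot package, so treat the last three lines as an assumption about your version):

```python
# Module-level cache shared by all handler threads of a threaded WSGI server.
import threading

cache = {}                      # plain Python objects, visible to every thread
cache_lock = threading.Lock()   # guard concurrent read-modify-write access

def application(environ, start_response):
    with cache_lock:
        cache['hits'] = cache.get('hits', 0) + 1
        hits = cache['hits']
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [('hits: %d\n' % hits).encode('utf-8')]

if __name__ == '__main__':
    from cherrypy import wsgiserver
    server = wsgiserver.CherryPyWSGIServer(('127.0.0.1', 8070), application)
    server.start()
```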
If none of those are an option, how about extracting the actual processing you're now doing in the webserver into an external process, which the web processes talk to via some form of RPC mechanism to push work requests and pull data? The "backend" processing server can then be a simple multithreaded Python process that offers an XML-RPC interface through the standard library SimpleXMLRPCServer or something similar.
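A minimal sketch of such a backend, keeping the shared state in ordinary Python objects inside one process (module names are the Python 2 ones the answer mentions; on Python 3 they live in xmlrpc.server and socketserver, and the function names here are just illustrative):

```python
# External "backend" process: a threaded XML-RPC server holding shared state.
import threading
import SocketServer
from SimpleXMLRPCServer import SimpleXMLRPCServer

class ThreadedXMLRPCServer(SocketServer.ThreadingMixIn, SimpleXMLRPCServer):
    """Handle each XML-RPC request in its own thread."""

store = {}
store_lock = threading.Lock()

def set_value(key, value):
    with store_lock:
        store[key] = value
    return True

def get_value(key):
    with store_lock:
        return store.get(key)

server = ThreadedXMLRPCServer(('127.0.0.1', 8000), allow_none=True)
server.register_function(set_value)
server.register_function(get_value)
server.serve_forever()
```

The WSGI workers would then talk to it with an XML-RPC client proxy (xmlrpclib.ServerProxy on Python 2, xmlrpc.client on Python 3), so the state lives in one place regardless of how many web processes uwsgi spawns.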