I have a Docker container running a Python script: it waits for input requests and processes data accordingly.
Since I am using Docker for development, I would like the container to stop the Python script and relaunch it with the new code whenever I change the source file on my machine (not in the container); right now I have to stop and relaunch the container manually. I could monitor the file changes on my side (rather than inside the container), but I would like to avoid that and do it within the container itself.
I am using docker-compose's volumes option to share the source code between my FS and the container's.
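For reference, the mount is the usual volumes mapping in docker-compose.yml (service name and paths here are illustrative, not the actual setup):

```yaml
# docker-compose.yml fragment (sketch; service name and paths are assumptions)
services:
  app:
    build: .
    volumes:
      - ./src:/app/src   # host source directory mounted into the container
```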
To monitor the file changes, I've been trying to use the watchmedo shell utility from the watchdog Python module. I have this weird problem: it doesn't notice the file changes to that Python source file unless I edit the file from inside the container rather than on my local FS, even though the directories are mounted with the volumes option.
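A typical watchmedo invocation for this kind of setup looks like the following (the directory and script path are placeholders, and watchdog is assumed to be installed in the container):

```shell
# Restart the script whenever a .py file under /app changes
# (assumes: pip install watchdog; /app/script.py is a placeholder).
watchmedo auto-restart --directory=/app --pattern="*.py" --recursive -- python /app/script.py
```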
I get the feeling this has something to do with how Docker works, and maybe with volumes too. I've been trying to read up on it online, but haven't had much luck. Any ideas? I'm totally stuck!
EDIT: Here's a gif that explains it better. The top two panes are connected to the same container and the bottom two to my local machine. All the panes point to the same folder.
You could have your container run something like this (inotify-tools needs to be installed):
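A minimal sketch of such a watch loop, built on inotifywait from inotify-tools (the script path is a placeholder):

```shell
#!/bin/sh
# Start the script in the background so the watch loop keeps control.
python /path/to/python/script.py &

# Block until the file is written to, then kill the old instance
# and relaunch the script with the new code.
while inotifywait -e close_write /path/to/python/script.py; do
    pkill -f /path/to/python/script.py
    python /path/to/python/script.py &
done
```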
Basically: wait for changes to the file, kill the running Python process, and run the script again. If the Python script is not running in the background/daemonized in any way, you can background it with an &, like so:
python /path/to/python/script &
Put this into run.sh and add something like this to your Dockerfile
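A minimal sketch of the Dockerfile addition (a Debian-based image and run.sh sitting in the build context are assumptions):

```dockerfile
# Sketch: install inotify-tools and run the watch script as the entrypoint.
RUN apt-get update && apt-get install -y inotify-tools
COPY run.sh /run.sh
RUN chmod +x /run.sh
CMD ["/run.sh"]
```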
... and you should be good.