I have several services running in their own Docker containers. In my project I also have a lib folder containing some small modules that all the services need.
What is the best way to include these modules in the Docker containers? For third-party modules I just use RUN pip install -r requirements.txt; is there a similar way to include my own modules?
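For reference, the layout looks roughly like this (the directory and file names here are hypothetical; the point is that lib sits outside each service's folder):

project/
  lib/
    shared_module.py      # hypothetical shared module
  services/
    frontend_web/
      Dockerfile
      docker-compose.yml
      requirements.txt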
I would suggest using data containers. You can take a similar approach to the one the Rails folks use for caching Bundler gems between image builds and across multiple containers:
In your docker-compose.yml file (I assume you are using Docker Compose) you can add a data volume that can be mounted into your other containers:
version: '2'
services:
  web:
    build: .
    volumes:
      - .:/app
      - bundle:/bundle
volumes:
  bundle: {}
By default, the volume is created if it does not already exist.
I'm not familiar with pip, but I think it could work in a similar way to Bundler: point pip's installation path at the data volume and you get a persisted layer where pip puts all its modules. When another container needs the same modules, just mount the same volume.
You may have to work out some issues, but I think the main idea could work in your case.
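A rough sketch of what that might look like with pip (untested; it relies on pip's --target option, and app.py and the volume name are just placeholders):

version: '2'
services:
  web:
    build: .
    # Install packages into the shared volume instead of an image layer,
    # then start the app (app.py is a placeholder for your entrypoint)
    command: sh -c "pip install --target=/pip-packages -r requirements.txt && python app.py"
    environment:
      # Make the shared install location importable
      - PYTHONPATH=/pip-packages
    volumes:
      - pip-packages:/pip-packages
volumes:
  pip-packages: {}

Other containers that need the same modules would mount pip-packages the same way and skip the install step.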
I ended up solving it by mounting the lib folder into the container using docker-compose, like so:
version: '2'
services:
  frontend_web:
    build: .
    volumes:
      - ../../lib:/app/lib
I then just had to add /app/lib to the container's PYTHONPATH and I could import any module from that folder.
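For completeness, one way to set that is via the environment key in the same compose file (just one option; the answer doesn't say which mechanism was used):

version: '2'
services:
  frontend_web:
    build: .
    environment:
      # Let Python find the mounted modules (assumes the image sets no PYTHONPATH of its own)
      - PYTHONPATH=/app/lib
    volumes:
      - ../../lib:/app/lib

Alternatively, ENV PYTHONPATH=/app/lib in the Dockerfile achieves the same thing.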