I'm currently building a Docker image and running the container to execute tests for a Python application I'm working on. The Dockerfile copies the project files from the host machine, sets the working directory to the copied files, runs apt-get update, installs pip, and finally runs the tests via setup.py. The Dockerfile can be seen below.
```
FROM ubuntu
ADD . /home/dev/ProjectName
WORKDIR /home/dev/ProjectName
RUN apt-get update && \
    apt-get install -y python3-pip && \
    python3 setup.py test
```
I was curious whether there is a more conventional way to avoid running apt-get update and installing pip every time I want to run the tests. The main idea I had was to build an image with pip already installed, and then build this image from that one, roughly as sketched below.
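Something like this (the `python3-pip-base` tag is just a placeholder name I made up):

```
# base.Dockerfile -- built once, e.g.
#   docker build -f base.Dockerfile -t python3-pip-base .
FROM ubuntu
RUN apt-get update && \
    apt-get install -y python3-pip
```

```
# Dockerfile -- rebuilt for every test run; pip already comes from the base image
FROM python3-pip-base
ADD . /home/dev/ProjectName
WORKDIR /home/dev/ProjectName
RUN python3 setup.py test
```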
Docker builds using cached layers when it can. Adding files that have changed invalidates the cache for that instruction and every instruction after it. Put the apt-get commands before the ADD and they will only run the first time you build (or whenever that RUN line itself changes). See this blog for more info.
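For example, reordering your Dockerfile along those lines (untested sketch, keeping your paths):

```
FROM ubuntu

# These layers are cached and reused on later builds
RUN apt-get update && \
    apt-get install -y python3-pip

# Copying the project comes last, so changed source files only
# invalidate the layers from here down
ADD . /home/dev/ProjectName
WORKDIR /home/dev/ProjectName
RUN python3 setup.py test
```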