I have an environment.yml in my application folder, and I have this in my Dockerfile:
RUN conda env create
RUN source activate myenvfromymlfile
When I run the container, though, the env is not activated. If I do conda env list, I see that /opt/conda is activated:
root@9c7181cf86aa:/app# conda env list
# conda environments:
#
myenvfromymlfile /opt/conda/envs/myenvfromymlfile
root * /opt/conda
If I attach to the container I can manually run source activate myenvfromymlfile and it works, but why doesn't that work in the RUN directive?
In examples, I see this often in dockerfiles that require conda:
CMD [ "source activate your-environment && exec python application.py" ]
Can someone explain why it is necessary to use && to make it a single command? And why running "source activate" in a RUN directive does not work? I want to have my dockerfile look like this:
RUN conda env create
RUN source activate myenvfromymlfile
ENTRYPOINT ["python"]
CMD ["application.py"]
Consider the Dockerfile above.

Statement #1:
conda env create
This creates the environment and changes files on disk.

Statement #2:
source activate myenvfromymlfile
This loads some settings into the current bash session. No disk changes are made here.

Statements #3 and #4 specify what happens when you run the container.
So when you run the container, anything you did in step #2 is not there: a shell was launched to run step #2, and when that step completed, the shell was closed. When you run the image, a brand-new shell is started, with no knowledge that at build time your Dockerfile ran
source activate myenvfromymlfile
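You can reproduce this effect outside Docker: each RUN directive behaves roughly like a separate sh -c invocation, so shell state set in one invocation does not survive into the next. A minimal sketch (the variable name is purely illustrative):

```shell
# Each RUN directive behaves like a separate `sh -c` invocation:
sh -c 'MYVAR=activated'                  # state is set in shell #1, which then exits
sh -c 'echo "MYVAR is ${MYVAR:-unset}"'  # shell #2 starts fresh: prints "MYVAR is unset"
```

Environment activation via source activate is exactly this kind of in-shell state, which is why it evaporates between RUN steps.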
Now you want to run application.py in the environment you created. Docker's default shell is sh -c. So when you set CMD to
source activate your-environment && exec python application.py
the final command executed at the start of the container becomes
sh -c "source activate your-environment && exec python application.py"
which activates the environment in the current shell and then runs your program. That is why the activation and the python invocation must be chained with && into a single command: both have to happen inside the same shell.
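Putting this together, a minimal Dockerfile might look like the sketch below. The base image continuumio/miniconda3 is an assumption (any image with conda on the PATH works), and the file names come from the question:

```dockerfile
# Assumed base image with conda preinstalled
FROM continuumio/miniconda3

WORKDIR /app

# conda env create reads environment.yml and writes the env to disk,
# so this state DOES persist across build steps
COPY environment.yml .
RUN conda env create

COPY application.py .

# Activation and the python invocation must share one shell,
# hence the single command chained with &&:
CMD [ "bash", "-c", "source activate myenvfromymlfile && exec python application.py" ]
```

On newer conda releases, source activate is superseded by conda activate, and an alternative that avoids activation entirely is CMD [ "conda", "run", "-n", "myenvfromymlfile", "python", "application.py" ].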