Take the following trivial package which contains setup_requires:
from setuptools import setup
setup(name='my_package', setup_requires=['cython'])
Assuming I have done the following to build it to a source distribution:
$ python setup.py sdist
And downloaded the source distribution for Cython:
$ pip install --download ./dist/ --no-use-wheel Cython
So now I have:
$ ls dist/
my_package-0.0.0.tar.gz
Cython-0.21.1.tar.gz
What I'd like to be able to do is install the package on a network-isolated machine using some combination of --find-links, etc.
I'd imagine I could do something like:
pip install --no-index --find-links="file:///$(pwd)/dists" dist/my_package-0.0.0.tar.gz
However I get an error that looks like this:
No local packages or download links found for cython
(Full text here: http://paste.pound-python.org/show/IxmzEEfQ5yZRU45i2FBM/)
What I've tried unsuccessfully:
Setting the following
[easy_install]
allow_hosts = ''
find_links = file:///$(pwd)/emr-sdists
in:
/usr/lib/python2.6/distutils/distutils.cfg
~/.pydistutils.cfg
./setup.cfg
I'm currently using the --net none setting of docker to help debug this, if it makes it easier for you to get to a reproduction.
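For reference, the isolated environment is just a container along these lines (the image and mount here are illustrative, not my exact setup):
$ docker run --rm -it --net none -v "$(pwd)/dist:/dist" centos:6 /bin/bash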
The problem (besides that your --find-links is typoed as dists instead of dist) is that the first thing pip does to install a package is run python setup.py egg_info, without bothering to pass along any of the package-finding information. Pip doesn't actually want setuptools to install any dependencies! It wants setuptools to spit out the dependencies as egg_info so pip can read them and go fetch them itself.
But dependencies in setup_requires are always installed on any invocation of setup.py. I'd go so far as to say that setup_requires is completely incompatible with pip.
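You can check this without pip in the picture at all: running the metadata step by hand from an unpacked copy of your sdist (with no network) hits the same easy_install lookup and the same "No local packages or download links found for cython" error from your paste, and nothing on this command line ever sees a --find-links value:
$ tar xzf dist/my_package-0.0.0.tar.gz
$ cd my_package-0.0.0
$ python setup.py egg_info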
The alternative is... to just put your build code in the build step. Pip won't try to build your package until all of its dependencies are installed anyway.
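As a rough sketch of that idea (the extension name my_package.fast and the .pyx path are made up, and Cython moves to install_requires so that pip, which does honour --no-index/--find-links, is the thing that fetches it):

from setuptools import setup, Extension
from setuptools.command.build_ext import build_ext as _build_ext

class build_ext(_build_ext):
    def run(self):
        # Deferred import: by the time pip runs the build, install_requires
        # has been satisfied, so no easy_install download is triggered.
        from Cython.Build import cythonize
        self.extensions = cythonize(self.extensions)
        _build_ext.run(self)

setup(
    name='my_package',
    install_requires=['cython'],  # fetched by pip itself, from your dist/ directory
    ext_modules=[Extension('my_package.fast', ['my_package/fast.pyx'])],
    cmdclass={'build_ext': build_ext},
)

With setup_requires gone, setup.py egg_info no longer needs Cython at all; pip reads the metadata, installs cython from dist/ like any other requirement, and only then runs the build step where the deferred import happens.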