Is there anything in Python or Linux that basically instructs the system to "install whatever is necessary"? I find it annoying to install Python packages for each new script/system/server that I work on; each time I end up doing a `sudo pip`, `apt-get`, or `dnf` anyway. Why not automate that within the script itself: wherever a 'no package found' error crops up, pass the library name to the install statement. Does this exist?
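Something like this hypothetical helper (a rough sketch of the idea, not a robust dependency manager; `ensure` is a name I made up):

```python
import importlib
import subprocess
import sys

def ensure(package, module=None):
    """Import `module` (defaults to `package`); on ImportError,
    pip-install `package` and import again. Rough sketch only."""
    name = module or package
    try:
        return importlib.import_module(name)
    except ImportError:
        # 'no package found' -> pass the name straight to pip
        subprocess.check_call([sys.executable, "-m", "pip", "install", package])
        return importlib.import_module(name)

json = ensure("json")  # already in the stdlib, so no install is triggered
```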
PS: I know docker exists, but am talking at a python/script level or a direct system level for purely execution purposes.
Thanks
You can use setuptools to install dependencies automatically when you install your custom project on a new machine. A requirements file works just fine if all you want to do is install a few PyPI packages. Here is a nice comparison between the two. From the same link you can see that if your project has two dependent packages `A` and `B`, all you have to include in your setup.py file is a single `install_requires` line.
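Sketched as it would appear inside the `setup()` call (the names `A` and `B` are the placeholder packages from the example):

```python
install_requires=["A", "B"]  # pip installs these alongside your project
```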
Of course, setuptools can do much more. You can include setups for external libraries (say, C files), non-PyPI dependencies, etc. The documentation gives a detailed overview of installing dependencies, and there is also a really good tutorial on getting started with Python packaging. From their example, a typical setup.py file would look like this.
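A minimal sketch of such a setup.py (the project name, version, description, and dependency names are placeholders):

```python
from setuptools import setup, find_packages

setup(
    name="sample",                 # placeholder project metadata
    version="0.1.0",
    description="A sample Python project",
    packages=find_packages(),      # pick up all packages in the source tree
    install_requires=["A", "B"],   # PyPI dependencies installed automatically
)
```

Running `pip install .` in the project directory then pulls in the listed dependencies before installing the project itself.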
In conclusion, it is simple to get started with setuptools, and it can make it fairly easy to migrate your code to a new machine.

The best way I know is: create a `requirements.txt` file, list all the package names in it, and install them using pip.
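An example `requirements.txt` (package names and version pins are illustrative):

```
requests==2.31.0
flask>=2.0
numpy
```

Then install everything it lists in one command: `pip install -r requirements.txt`.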