I'm on Mac OS X, and I've heard that to avoid globally installing packages (using sudo), which might interfere with the Python files that OS X itself uses, Python packages should be installed to a path different from the one OS X uses.
Currently, Python executables are installed in:
/usr/local/bin/
pip installs modules here:
/usr/local/lib/python2.7/site-packages
and Python itself runs from here:
/usr/local/bin/python
Are these paths safe?
If you are on OS X, you should also have Python in /usr/bin:
$ which -a python
/usr/local/bin/python
/usr/bin/python
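You can confirm which is which by checking each one's version (the exact version numbers will vary with your setup):
$ /usr/bin/python --version
$ /usr/local/bin/python --version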
If you are using brew, the first python should be a symlink:
$ ls -hl $(which python)
lrwxr-xr-x 1 user admin 34B Jun 23 16:53 /usr/local/bin/python -> ../Cellar/python/2.7.11/bin/python
If you are not using brew, you will have to explain to us how you installed a second version of python.
You should also have at least two site-packages:
$ find /usr -name 'site-packages'
/usr/local/Cellar/python/2.7.11/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages
/usr/local/lib/python2.7/site-packages
If you installed python using brew, you should also have pip:
$ which pip
/usr/local/bin/pip
You should probably upgrade that to the latest pip:
$ pip install --upgrade pip
It should be safe to install python packages using /usr/local/bin/pip because they will be installed in /usr/local/lib/python2.7/site-packages. The /usr/local path is specifically for local software. Also, brew installs its files to /usr/local, so if you are using brew, you are already installing files there.
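If you want to double-check where a particular package actually ended up, pip show prints its install location (requests is used here purely as an example; on a setup like the one above, the Location line should point at /usr/local/lib/python2.7/site-packages):
$ pip show requests | grep Location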
I am not sure why some folks say not to install any packages globally. I have never seen a reference that explained why this was a bad idea. If multiple users need the same package, it makes more sense to install it globally.
When I first started using virtualenv, it did not always work the way I expected it to. I had a machine with multiple users that needed requests, and because of problems with virtualenv, I wound up installing it globally using pip.
Both virtualenv and pip have improved a lot since I first started using them, and I can see how using them can prevent some problems. If you are developing new software that needs the latest version of a package, virtualenv allows you to install the package without affecting the rest of the system. However, I still do not see why it is a bad idea to install packages globally.
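As a rough sketch of that isolation (this assumes virtualenv is installed, and uses requests purely as an example package):
$ virtualenv /tmp/demo-venv
$ source /tmp/demo-venv/bin/activate
(demo-venv) $ pip install requests    # installed only inside /tmp/demo-venv
(demo-venv) $ deactivate
$ pip show requests                   # the global pip won't see it unless it was also installed globally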
You shouldn't be mucking about with Python package paths, and you shouldn't be installing Python packages globally at all. Use a virtualenv for each project, and let pip install the libraries locally inside the virtualenv.
I suggest you make use of virtual environments; you can learn more about them here: http://docs.python-guide.org/en/latest/dev/virtualenvs/
To sum up:
- Create a virtual environment (venv):
$ virtualenv venv
This creates a copy of Python in whichever directory you ran the command in, placing it in a folder named venv.
- Activate the virtual environment:
$ source venv/bin/activate
- Install packages with pip:
$ pip install requests
- When you are done, deactivate the virtual environment:
$ deactivate
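To confirm that the activation in step 2 actually took effect, you can check which python and pip are first on your PATH while the environment is active:
$ source venv/bin/activate
(venv) $ which python    # should point at venv/bin/python
(venv) $ which pip       # should point at venv/bin/pip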
There are a number of options you can take; the easiest (as others have suggested) is virtualenv. Hopefully that's already installed; if not, it is one of the few modules you should install globally. If you have Python 3.4+, you "should" have the venv module (which is similar to virtualenv, but is maintained by the Python team).
python2:
$ virtualenv ~/.py-venvs/python2
python3:
$ python3 -m venv ~/.py-venvs/python3
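Once either environment exists, you activate and use it like any other venv (the path below just matches the python3 example above):
$ source ~/.py-venvs/python3/bin/activate
(python3) $ pip install requests    # goes into ~/.py-venvs/python3, not the system site-packages
(python3) $ deactivate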
You could also install modules for the local user using pip, but I'm not sure how well this is supported these days:
pip install --user requests
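If you do go the --user route, you can ask Python where those per-user packages end up (the exact path varies by Python version and platform):
$ python -m site --user-site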
You could also append directories to the $PYTHONPATH environment variable, but this should only be done as a last resort and under parental supervision :D. Try the other methods before this one.
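If you really must, the usual pattern is to export it from your shell startup file; the directory below is just a placeholder for wherever your modules actually live:
# in ~/.bash_profile (or ~/.bashrc)
export PYTHONPATH="$HOME/my-python-libs:$PYTHONPATH"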