I have a Python project consisting of a Jupyter notebook, several scripts in a bin directory, and modules in a src directory, with dependencies in a Pipfile:
myproject
├── myproject.ipynb
├── Pipfile
├── Pipfile.lock
├── bin
│   ├── bar.py
│   └── foo.py
└── src
    ├── baz.py
    └── qux.py
The scripts foo.py and bar.py use the standard shebang

#!/usr/bin/env python

and can be run with pipenv shell:
mymachine:myproject myname$ pipenv shell
(myproject-U308romt) bash-3.2$ bin/foo.py
foo
However, I can't easily access the modules in src from the scripts. If I add

import src.baz as baz

to foo.py, I get:

ModuleNotFoundError: No module named 'src'
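For reference, the usual script-local workaround (which doesn't involve pipenv at all) is to push the project root onto sys.path at the top of each script. A minimal sketch, assuming the script lives in bin/, one level below the project root:

```python
# Sketch: prepend the project root to sys.path so that "src" becomes
# importable as a package. Assumes this file sits in bin/, one level
# below the project root containing src/.
import os
import sys

project_root = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
sys.path.insert(0, project_root)

# import src.baz as baz  # would now resolve, given src/ under the root
```

The drawback is that every script needs this boilerplate, which is exactly the kind of thing I'd hope pipenv could handle once for the whole project.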
One solution I tried is to add a .env file under myproject:

PYTHONPATH=${PYTHONPATH}:${PWD}
This works thanks to pipenv's automatic loading of .env, but checking the .env file into the project's git repository would collide with the traditional use of .env to store secrets such as passwords. In fact, my default .gitignore for Python projects already excludes .env for just this reason:
$ git add .env
The following paths are ignored by one of your .gitignore files:
.env
Use -f if you really want to add them.
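The -f escape hatch git mentions does work, though it undercuts the very reason .env is ignored. A throwaway demo in a scratch repository (hypothetical contents):

```shell
# Demo in a scratch repo: plain `git add` refuses the ignored file,
# but -f force-adds it despite the .gitignore rule.
cd "$(mktemp -d)"
git init -q
echo '.env' > .gitignore
echo 'PYTHONPATH=${PYTHONPATH}:${PWD}' > .env
git add .env 2>&1 || true   # rejected: path is ignored
git add -f .env             # force-add anyway
git ls-files .env           # now tracked
```

But then any secret later placed in .env would be one careless commit away from the repository history, so force-adding is not a real solution.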
Alternatively, I could move src under bin, but then the Jupyter notebook would have to reference the modules as bin.src.baz etc., which is also a hassle.
My current workaround is just to add a symlink:
myproject
├── Pipfile
├── Pipfile.lock
├── bin
│   ├── bar.py
│   ├── foo.py
│   └── src -> ../src
└── src
    ├── baz.py
    └── qux.py
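For completeness, the symlink above can be recreated like this (demo in a scratch directory mimicking the layout):

```shell
# Recreate the layout in a scratch directory and add the relative
# symlink bin/src -> ../src, matching the tree shown above.
cd "$(mktemp -d)"
mkdir -p myproject/bin myproject/src
ln -sfn ../src myproject/bin/src
readlink myproject/bin/src   # prints: ../src
```

Because the link target is relative, it survives moving or cloning the project as a whole, though symlinks can be awkward on some platforms and with some archive tools.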
This works, and I suppose it has the benefit of being transparent, but it seems like there should be some way to leverage pipenv to solve the same problem. Is there a portable, distributable way to put these modules on the search path?