I've tried reading through questions about sibling imports and even the package documentation, but I've yet to find an answer.
With the following structure:
├── LICENSE.md
├── README.md
├── api
│   ├── __init__.py
│   ├── api.py
│   └── api_key.py
├── examples
│   ├── __init__.py
│   ├── example_one.py
│   └── example_two.py
└── tests
    ├── __init__.py
    └── test_one.py
How can the scripts in the examples and tests directories import from the api module and be run from the command line?

Also, I'd like to avoid the ugly sys.path.insert hack for every file. Surely this can be done in Python, right?
First, you should avoid having files with the same name as the module itself. It may break other imports.

When you import a file, the interpreter first checks the current directory and then searches the global directories.

Inside examples or tests you can call:
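For instance, a minimal sketch of such a call (the exact snippet is an assumption), placed at the top of a script in examples or tests:

```python
import os
import sys

# Put the project root (the parent of this script's folder) on sys.path
# so that the sibling package "api" becomes importable.
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), os.pardir)))

import api.api
```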
Seven years after

Since I wrote the answer below, modifying sys.path is still a quick-and-dirty trick that works well for private scripts, but there have been several improvements:

- Installing the package (in a virtualenv or not) will give you what you want, though I would suggest using pip to do it rather than setuptools directly (and using setup.cfg to store the metadata).
- Using the -m flag and running as a package works too (but will turn out a bit awkward if you want to convert your working directory into an installable package).
- For the tests specifically, pytest is able to find the api package in this situation and takes care of the sys.path hacks for you.

So it really depends on what you want to do. In your case, though, since it seems that your goal is to make a proper package at some point, installing through pip -e is probably your best bet, even if it is not perfect yet.

Old answer
As already stated elsewhere, the awful truth is that you have to do ugly hacks to allow imports from sibling modules or the parent package from a __main__ module. The issue is detailed in PEP 366. PEP 3122 attempted to handle imports in a more rational way, but Guido rejected it (here).

Though, I use this pattern on a regular basis:
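A sketch of that pattern (the exact snippet is an assumption), placed at the top of a script under examples or tests that is meant to be run directly:

```python
# Hack to allow absolute imports from the top-level folder when this file
# is executed directly as a script (in that case __package__ is not set).
if __name__ == "__main__" and __package__ is None:
    from sys import path
    from os.path import dirname as dir

    path.append(dir(path[0]))   # path[0] is this script's folder
    __package__ = "examples"    # assumption: this script lives in examples/

import api
```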
Here path[0] is your running script's parent folder and dir(path[0]) your top level folder.

I have still not been able to use relative imports with this, though, but it does allow absolute imports from the top level (in your example, api's parent folder).
You don't need and shouldn't hack sys.path unless it is necessary, and in this case it is not. Use:
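For example (the precise import line is an assumption based on the files in your tree), in tests/test_one.py or the example scripts, a plain absolute import with no path manipulation:

```python
# tests/test_one.py or examples/example_one.py -- no sys.path editing needed
import api.api_key
```

This works because running the module with -m from the project root puts that directory on sys.path.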
Run from the project directory: python -m tests.test_one.

You should probably move tests (if they are api's unit tests) inside api and run python -m api.test to run all tests (assuming there is a __main__.py), or python -m api.test.test_one to run test_one instead.

You could also remove __init__.py from examples (it is not a Python package) and run the examples in a virtualenv where api is installed, e.g., pip install -e . in a virtualenv would install the api package in place if you have a proper setup.py.
Just in case someone using Pydev on Eclipse ends up here: you can add the sibling's parent path (and thus the calling module's parent) as an external library folder using Project->Properties and setting External Libraries under the left menu Pydev-PYTHONPATH. Then you can import from your sibling, e.g. from sibling import some_class.

I don't yet have the comprehension of Pythonology necessary to see the intended way of sharing code amongst unrelated projects without a sibling/relative import hack. Until that day, this is my solution. For examples or tests to import stuff from ..\api, it would look like:
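A sketch of what that could look like (the exact snippet is an assumption), pasted at the top of each file under examples or tests:

```python
import os
import sys

# Append the project root (one level above this file) to sys.path so the
# sibling package "api" can be imported.
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from api import api
```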
Tired of sys.path hacks?

There are plenty of sys.path.append hacks available, but I found an alternative way of solving the problem at hand: setuptools. I am not sure if there are edge cases which do not work well with this. The following was tested with Python 3.6.5 (Anaconda, conda 4.5.1) on a Windows 10 machine.

Setup
The starting point is the file structure you have provided, wrapped in a folder called myproject.

I will call . the root folder, and in my example case it is located at C:\tmp\test_imports\.

api.py

As a test case, let's use the following ./api/api.py:
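A minimal stand-in for that file (the function body is an assumption; it only needs to define the function_from_api used below):

```python
# ./api/api.py
def function_from_api():
    return 'I am the return value from api.api!'
```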
test_one.py
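A sketch of the first version of the test file, assuming it calls function_from_api through a plain absolute import:

```python
# ./tests/test_one.py -- first attempt, plain absolute import
from api.api import function_from_api


def test_api():
    assert function_from_api() == 'I am the return value from api.api!'


if __name__ == '__main__':
    test_api()
```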
Try to run test_one: running python tests/test_one.py from the root folder fails with ModuleNotFoundError: No module named 'api'.

Also, trying relative imports won't work: using from ..api.api import function_from_api would result in an error as well, since test_one.py is executed as a top-level script.

Steps
1) Make a setup.py file in the root level directory

The contents for the setup.py would be* roughly as follows.
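(A minimal sketch; the name and version fields are assumptions. See the setuptools docs, per the footnote, for fuller examples.)

```python
# ./setup.py -- minimal sketch
from setuptools import setup, find_packages

setup(
    name='myproject',
    version='0.1.0',
    packages=find_packages(),
)
```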
2) Use a virtual environment
If you are familiar with virtual environments, activate one and skip to the next step. Usage of virtual environments is not absolutely required, but they will really help you out in the long run (when you have more than one project ongoing). The most basic steps are (run in the root folder):

- python -m venv venv
- . venv/bin/activate (Linux) or ./venv/Scripts/activate (Win)

To learn more about this, just google "python virtual env tutorial" or similar. You probably never need any other commands than creating, activating and deactivating.

Once you have made and activated a virtual environment, your console should show the name of the virtual environment in parentheses, and your folder tree should look like this:**
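Roughly like this (a sketch; the exact contents of the venv folder vary by platform):

```
.
├── myproject
│   ├── api
│   ├── examples
│   └── tests
├── setup.py
└── venv
```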
3) pip install your project in editable state
Install your top level package myproject using pip. The trick is to use the -e flag when doing the install. This way it is installed in an editable state, and all the edits made to the .py files will be automatically included in the installed package.

In the root directory, run pip install -e . (note the dot, it stands for "current directory").

You can also see that it is installed by using pip freeze.
4) Add myproject. into your imports

Note that you will have to add myproject. only into imports that would not work otherwise. Imports that worked without the setup.py & pip install will still work fine. See an example below.

Test the solution
Now, let's test the solution using api.py defined above and test_one.py defined below.

test_one.py
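A sketch of the updated test file (the myproject. prefix on the import is the only change from the earlier version; the function and message are the same assumptions as above):

```python
# ./tests/test_one.py -- note the myproject. prefix on the import
from myproject.api.api import function_from_api


def test_api():
    assert function_from_api() == 'I am the return value from api.api!'


if __name__ == '__main__':
    test_api()
    print('test_api() passed')
```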
Running the test from the root folder (python tests/test_one.py) should now work, since the myproject package can be resolved through the editable install.
* See the setuptools docs for more verbose setup.py examples.
** In reality, you could put your virtual environment anywhere on your hard disk.