There are already quite a few questions here about relative imports in Python 3, but after going through many of them I still couldn't find the answer to my issue, so here is the question.
I have a package laid out as shown below:
package/
    __init__.py
    A/
        __init__.py
        foo.py
    test_A/
        __init__.py
        test.py
and I have a single line in test.py:
from ..A import foo
Now, I am in the folder of package, and I run
python -m test_A.test
and I get the message
"ValueError: attempted relative import beyond top-level package"
But if I am in the parent folder of package, e.g., I run:
cd ..
python -m package.test_A.test
everything is fine.
Now my question is: when I am in the folder of package and I run the module inside the test_A sub-package as test_A.test, then based on my understanding ..A only goes up one level, which is still within the package folder. Why does it give a message saying the import is beyond the top-level package? What exactly is the reason for this error message?
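For reference, one way to see what Python treats as the top-level package here is to print __name__ and __package__ at the top of test.py (just a quick diagnostic sketch, not part of the layout above):

    # test.py -- temporary diagnostic lines (remove after inspecting)
    print(__name__)     # "test_A.test" when run as: python -m test_A.test
    print(__package__)  # "test_A" -- a single level, so ".." already leaves the top-level package

    from ..A import foo  # relative imports are resolved against __package__, not the folder on disk

Run the same file as python -m package.test_A.test and __package__ becomes "package.test_A", which is why the two-dot import then has room to go up.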
If someone's still struggling a bit after the great answers already provided, consider checking out this:
https://www.daveoncode.com/2017/03/07/how-to-solve-python-modulenotfound-no-module-named-import-error/
Essential quote from the above site:
It's pretty obvious that it has to be this way, thinking on it after the fact. I was trying to use sys.path.append('..') in my tests, but ran into the issue posted by the OP. By adding the import and the sys.path definition before my other imports, I was able to solve the problem.
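A minimal sketch of that workaround, assuming the question's layout; the appended path (the parent directory of package/) and the absolute import target are assumptions to adjust for your own tree:

    # package/test_A/test.py -- sys.path tweak placed before the imports that need it
    import os
    import sys

    # add the parent directory of package/ to the import search path
    sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..")))

    from package.A import foo  # absolute import now works no matter where you run from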
Try this. Worked for me.
Assumption:
If you are in the package directory, A and test_A are separate packages.
Conclusion:
..A imports are only allowed within a package.
Further notes:
Making relative imports available only within packages is useful if you want to ensure that packages can be placed on any path located on sys.path.
EDIT:
The current working directory is usually on sys.path, so all files there are importable. This has been the behavior since Python 2, when packages did not yet exist. Making the running directory a package would allow imports of modules both as "import .A" and as "import A", which would then be two different modules. Maybe this is an inconsistency to consider.
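A small sketch of that last point, assuming the layout from the question and that Python is started from inside the package directory:

    # run from inside package/ -- the working directory lands on sys.path,
    # so A and test_A each behave as their own top-level package
    import sys
    print(sys.path[0])  # '' or the current directory, depending on how Python was started

    import A            # top-level package "A"
    import test_A       # a separate top-level package "test_A"
    print(A.__name__, test_A.__name__)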
None of these solutions worked for me in Python 3.6, with a folder structure like:
My goal was to import from module1 into module2. What finally worked for me was, oddly enough:
Note the single dot as opposed to the two-dot solutions mentioned so far.
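In other words, something like the following sketch, where module1 and module2 are the names from above and some_function is just a placeholder:

    # module2.py -- single-dot relative import from a sibling module in the same package
    from .module1 import some_function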
Edit: The following helped clarify this for me:
In my case, the working directory was (unexpectedly) the root of the project.
This same problem is noted in this question with a more coherent answer: Sibling package imports
Why doesn't it work? It's because Python doesn't record where a package was loaded from. So when you do
python -m test_A.test
it basically just discards the knowledge that test_A.test is actually stored in package (i.e. package is not considered a package). Attempting from ..A import foo is trying to access information it doesn't have any more (i.e. sibling directories of a loaded location). It's conceptually similar to allowing from ..os import path in a file in math. This would be bad because you want the packages to be distinct. If they need to use something from another package, then they should refer to it globally with from os import path and let Python work out where that is with $PATH and $PYTHONPATH.
When you use
python -m package.test_A.test
then using from ..A import foo resolves just fine because Python kept track of what's in package and you're just accessing a child directory of a loaded location.
Why doesn't Python consider the current working directory to be a package? NO CLUE, but gosh it would be useful.
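A sketch of the two options that fall out of this, assuming the layout from the question and that Python is started from the parent directory of package/:

    # package/test_A/test.py

    # Option 1: keep the relative import, but always run the module through its
    # full dotted name so Python knows it lives inside package:
    #     python -m package.test_A.test
    from ..A import foo

    # Option 2: refer to the sibling sub-package "globally" instead, so it does
    # not matter that the module has no record of being stored inside package:
    #     from package.A import foo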
If you have an __init__.py in an upper folder, you can set up the import in that init file (importing the module path under an alias). Then you can use it from lower-level scripts.
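A minimal sketch of that idea, assuming the layout from the question and that package/ itself is importable (i.e. Python is started from its parent directory); the alias name is just an example:

    # package/__init__.py -- do the import once here, under an alias
    from .A import foo as foo_alias

    # package/test_A/test.py -- lower-level scripts reuse the name exposed above
    from package import foo_alias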