I'm trying to build a project that includes a few open-source third-party libraries, and I want to bundle those libraries with my distribution because I expect my users to want to use my project without an Internet connection. Additionally, I'd like to leave their code 100% untouched and, if I can, even leave their directory structure untouched. My approach so far has been to extract the tarballs and place each entire folder in MyProject/lib, but I've had to put __init__.py in every sub-directory to be able to reference the third-party code from my own modules.
What's the common best practice for accomplishing this task? How can I best respect the apps' developers by retaining their projects' structure? How can I make importing these libraries in my code less painful?
I'm new to making a distributable project in Python, so I have no clue whether there's something I can do in __init__.py or in setup.py to keep myself from having to type from lib.app_name.app_name.app_module import * and the like.
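To make it concrete, what I'm imagining is something like the sketch below in my top-level __init__.py, which would put each extracted tarball's folder on sys.path so the bundled packages could be imported by their original names. The directory and package names are just placeholders, and I don't know whether this is considered acceptable practice:

```python
# MyProject/__init__.py -- rough sketch; the "app_name" style folder names
# mentioned below are placeholders for whatever the extracted tarballs are
# actually called.
import os
import sys

# MyProject/lib holds the untouched, extracted third-party tarballs.
_LIB_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)), "lib")

if os.path.isdir(_LIB_DIR):
    for _entry in sorted(os.listdir(_LIB_DIR)):
        _path = os.path.join(_LIB_DIR, _entry)
        # Each tarball usually unpacks to something like lib/app_name-1.2/,
        # with the importable package one level down, so add that folder
        # (not lib/ itself) to the import path.
        if os.path.isdir(_path) and _path not in sys.path:
            sys.path.insert(0, _path)
```

With something like that, import app_name would work instead of from lib.app_name.app_name.app_module import *, and I wouldn't need the extra __init__.py files scattered through the vendored trees.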
For what it's worth, I will be distributing this on OS X and possibly *nix. I'd like to avoid using another library (e.g. setuptools) to accomplish this. (Weirdly, it seems to already be installed on my system, but I didn't install it, so I've no idea what's going on.)
I realize that this question seems to be a duplicate of this one, but I don't think it is, because I'm asking for the best practice for the "distribute-the-third-party-code-with-your-own" approach. Please forgive me if I'm asking a garbage question.
buildout is a good solution for building and distributing Python software.
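For instance, a minimal buildout.cfg might look like the following; the part name and egg name are just placeholders for your own project:

```ini
[buildout]
# Develop the current checkout in place and build one part.
develop = .
parts = myproject

[myproject]
# zc.recipe.egg installs the listed eggs and generates scripts
# with them on sys.path.
recipe = zc.recipe.egg
eggs = MyProject
```

Running bin/buildout then assembles the project and its dependencies locally (by default inside the project directory), which makes it easier to ship everything together.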
You could have a look at the inner workings of virtualenv to get some inspiration on how to go about this. Maybe you can reuse code from there.
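As a rough illustration of the idea (not virtualenv's actual code), the standard-library site module offers a related mechanism: site.addsitedir() puts a directory on sys.path and processes any .pth files in it, so something like this could expose a bundled lib directory. The path below is just an assumed layout:

```python
# Sketch only: point Python at a bundled lib/ directory, similar in spirit
# to how a virtualenv's site-packages directory is wired up.
import os
import site

# Assumes lib/ sits next to this file; adjust for the real project layout.
lib_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), "lib")

# addsitedir() appends the directory to sys.path and honours any .pth
# files it contains, just like a normal site-packages directory.
site.addsitedir(lib_dir)
```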