I have two branches, Development and Production. Each has dependencies, some of which are different. Development points to dependencies that are themselves in development. Likewise for Production. I need to deploy to Heroku which expects each branch's dependencies in a single file called 'requirements.txt'.
What is the best way to organize this?
What I've thought of:
- Maintain separate requirements files, one in each branch (must survive frequent merges!)
- Tell Heroku which requirements file I want to use (environment variable?)
- Write deploy scripts (create temp branch, modify requirements file, commit, deploy, delete temp branch)
A viable option today, which didn't exist when the original question and answer were posted, is to use pipenv instead of pip to manage dependencies.
With pipenv, there is no need to manually maintain two separate requirements files as with pip; instead, pipenv tracks development and production packages itself based on the commands you run.
To install a package for use in both production and development:
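```shell
pipenv install <package>
```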
To install a package for the development environment only:
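```shell
pipenv install <package> --dev
```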
Via those commands, pipenv stores and manages the environment configuration in two files (Pipfile and Pipfile.lock). Heroku's current Python buildpack natively supports pipenv and, if a Pipfile.lock exists, will configure itself from it instead of requirements.txt.
See the pipenv link for full documentation of the tool.
You can cascade your requirements files and use the "-r" flag to tell pip to include the contents of one file inside another. For example, you can break your requirements out into a modular folder hierarchy like this:
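```
requirements.txt
requirements/
    common.txt
    dev.txt
    prod.txt
```

(The folder and file names here are just a convention; pip only cares about the paths you pass to `-r`.)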
The files' contents would look like this:
common.txt:
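```
# packages needed in every environment (package names are examples)
Django
```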
dev.txt:
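```
# pull in everything from common.txt (path is relative to this file),
# then add development-only packages (package names are examples)
-r common.txt
django-debug-toolbar
```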
prod.txt:
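```
# pull in everything from common.txt,
# then add production-only packages (package names are examples)
-r common.txt
gunicorn
```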
Outside of Heroku, you can now set up environments like this:
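```shell
pip install -r requirements/dev.txt
```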
or
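```shell
pip install -r requirements/prod.txt
```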
Since Heroku looks specifically for "requirements.txt" at the project root, it should just mirror prod, like this:
requirements.txt:
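```
-r requirements/prod.txt
```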