We have some proto files for gRPC in a repo, and I read that it is not good to commit generated code. So I figured the generation needs to happen as part of package installation (e.g. via setuptools / setup.py).
However, according to the docs, to generate gRPC code you first need to install a package by running `pip install grpcio-tools`. But the purpose of setup.py is to automatically pull down dependencies like grpcio-tools.
So is there a best practice for doing this? That is, how do you generate code that depends on another Python package from within setuptools? Am I better off just creating a separate build.sh
script that manually pip-installs and generates the code? Or should I expect users of the package to already have grpcio-tools installed?
As far as I know, the "current" best practice is:
Executing `pip install .` is roughly equivalent to performing `pip install -r requirements.txt` followed by `python setup.py build` and `python setup.py install`.
This is a custom command that generates python sources from proto files:
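A minimal sketch of such a command (the `proto/` directory layout and the file name `example.proto` are assumptions for illustration):

```python
import setuptools


class BuildProtosCommand(setuptools.Command):
    """Compile .proto files into Python modules using grpc_tools."""

    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        # Import here, not at the top of the file: grpcio-tools may not
        # be installed yet the first time pip evaluates setup.py.
        from grpc_tools import protoc
        protoc.main([
            "grpc_tools.protoc",
            "-Iproto",                   # where the .proto files live (assumed)
            "--python_out=proto",        # generated message classes
            "--grpc_python_out=proto",   # generated service stubs
            "proto/example.proto",       # hypothetical proto file
        ])
```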
It is invoked by customizing the build_py command, like this:
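A sketch of that customization, assuming the proto-generation command described above is registered under the hypothetical name `build_protos`:

```python
import setuptools
from setuptools.command.build_py import build_py


class BuildPyCommand(build_py):
    """Generate the proto sources before the normal build_py step."""

    def run(self):
        # "build_protos" is the name under which the custom
        # proto-generation command is registered in cmdclass.
        self.run_command("build_protos")
        super().run()


# In setup.py, wire both commands in via cmdclass:
# setuptools.setup(
#     ...,
#     cmdclass={
#         "build_protos": BuildProtosCommand,
#         "build_py": BuildPyCommand,
#     },
# )
```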
Note the import inside the run method. It seems that pip runs setup.py several times, both before and after installing the requirements. So if you put the import at the top of the file, the build fails.
Along with @makeroo's approach, an alternative way is to execute the `grpc_tools.protoc` module as a subprocess. The benefit of this approach is that you reliably receive the generation result: `0` for success and `1` for error. This will create the generated Python files in the `proto` directory where the `.proto` files reside.
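A minimal sketch of this subprocess-based invocation (the `proto` directory and `example.proto` file name are assumptions for illustration):

```python
import subprocess
import sys


def generate_protos(proto_dir="proto", proto_file="proto/example.proto"):
    """Run grpc_tools.protoc as a subprocess and return its exit code."""
    result = subprocess.run([
        sys.executable, "-m", "grpc_tools.protoc",
        "-I" + proto_dir,                  # proto search path
        "--python_out=" + proto_dir,       # generated message classes
        "--grpc_python_out=" + proto_dir,  # generated service stubs
        proto_file,
    ])
    # 0 on success, non-zero on error.
    return result.returncode
```

Checking the return code explicitly lets the build fail loudly instead of silently shipping a package with missing generated modules.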