I have a TensorFlow model that works well, built with Python and TFLearn. Is there a way to run this model on another system without installing TensorFlow on it? It is already pre-trained, so I just need to run data through it.
I am aware of tfcompile (Thread here), but it seems quite complex to set up. Are there any alternatives?
> Is there a way to run this model on another system without installing Tensorflow on it? It is already pre-trained, so I just need to run data through it.
Yes.
After your model is trained, use `tf.python.tools.freeze_graph`
and `tf.python.tools.optimize_for_inference_lib`
to freeze and optimize the model for inference on other devices, such as Android.
The output of the above will be:
- a frozen graph protobuf file (.pb)
- an optimized graph protobuf file (.pb)
(These functions convert all the Variables of the model to constant operations and export the result to a protobuf file.)
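A minimal sketch of the freeze-and-optimize step. It is written against the `tf.compat.v1` API so it also runs under TensorFlow 2.x; the toy two-weight model, the node names `input`/`output`, and the file name `optimized_model.pb` are placeholders standing in for your own trained TFLearn graph. It uses `convert_variables_to_constants` (the helper that the `freeze_graph` tool wraps) together with `optimize_for_inference_lib`:

```python
import tensorflow.compat.v1 as tf  # assumes TF 1.13+ or TF 2.x
from tensorflow.python.tools import optimize_for_inference_lib

tf.disable_eager_execution()

# Toy graph standing in for your trained TFLearn model.
x = tf.placeholder(tf.float32, shape=[None, 2], name="input")
w = tf.Variable([[1.0], [2.0]], name="w")
tf.matmul(x, w, name="output")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Replace every Variable with a Constant holding its trained value
    # (this is what the freeze_graph tool does under the hood).
    frozen = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, ["output"])

# Strip nodes that are only needed for training, keeping the
# input -> output inference path.
optimized = optimize_for_inference_lib.optimize_for_inference(
    frozen, ["input"], ["output"], tf.float32.as_datatype_enum)

with tf.gfile.GFile("optimized_model.pb", "wb") as f:
    f.write(optimized.SerializeToString())
```

The input and output node names must match the names in your own graph; you can list them by iterating over `sess.graph_def.node`.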
Load the optimized graph protobuf file using the inference methods available in the Java API and other TensorFlow APIs, pass in your data, and read the output.
(Note: for this you do not need to install the complete TensorFlow, only the inference library.)
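Roughly, loading the frozen graph and running data through it looks like this in Python; the Java inference API follows the same load-graph/feed/fetch pattern. The node names `input` and `output` are assumptions, and the in-memory graph below is a self-contained stand-in for the bytes you would normally read from your exported `.pb` file with `graph_def.ParseFromString(f.read())`:

```python
import tensorflow.compat.v1 as tf  # assumes TF 1.13+ or TF 2.x

tf.disable_eager_execution()

# Stand-in for your exported file: a tiny frozen graph built in memory.
# In practice you would instead do:
#   graph_def = tf.GraphDef()
#   with tf.gfile.GFile("optimized_model.pb", "rb") as f:
#       graph_def.ParseFromString(f.read())
g = tf.Graph()
with g.as_default():
    x = tf.placeholder(tf.float32, [None, 2], name="input")
    w = tf.constant([[1.0], [2.0]])  # weights are constants in a frozen graph
    tf.matmul(x, w, name="output")
graph_def = g.as_graph_def()

# Import the frozen GraphDef into a fresh graph and run inference.
graph = tf.Graph()
with graph.as_default():
    tf.import_graph_def(graph_def, name="")

with tf.Session(graph=graph) as sess:
    result = sess.run("output:0", feed_dict={"input:0": [[3.0, 4.0]]})
    # result == [[11.0]] since 3*1 + 4*2 = 11
```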
A simple example is demonstrated here:
https://omid.al/posts/2017-02-20-Tutorial-Build-Your-First-Tensorflow-Android-App.html
It is for Android, but the procedure should be the same for Java.
For C++: click here