I know about the "Serving a TensorFlow Model" page
https://www.tensorflow.org/serving/serving_basic
but those functions assume you're using tf.Session(), which the DNNClassifier tutorial does not. I then looked at the API doc for DNNClassifier, and it has an export_savedmodel function (the export function is deprecated). It seems simple enough, but I am getting a "'NoneType' object is not iterable" error, which supposedly means I'm passing in an empty variable, but I'm unsure what I need to change. I've essentially copied and pasted the code from the get_started/tflearn page on tensorflow.org, but then added
directoryName = "temp"
def serving_input_fn():
    print("asdf")

classifier.export_savedmodel(
    directoryName,
    serving_input_fn
)
just after the classifier.fit function call. The other parameters for export_savedmodel are optional, I believe. Any ideas?
Tutorial with Code: https://www.tensorflow.org/get_started/tflearn#construct_a_deep_neural_network_classifier
API Doc for export_savedmodel https://www.tensorflow.org/api_docs/python/tf/contrib/learn/DNNClassifier#export_savedmodel
There are two possible questions here, with two possible answers. First, you encounter a missing session for the DNNClassifier, which uses the higher-level Estimators API (as opposed to the lower-level APIs where you manipulate the ops yourself). The nice thing about TensorFlow is that the high- and low-level APIs are more-or-less interoperable, so if you want a session and want to do something with it, it is as simple as adding:
Then you can start hooking in the remainder of the serving tutorial.
The second interpretation of your question concerns export_savedmodel itself. Actually, export_savedmodel and the sample code from the serving tutorial try to achieve the same goal. When you are training your graph, you set up some infrastructure to feed input to the graph (typically batches from a training dataset); when you switch to serving, however, you will often read your input from somewhere else, and you need separate infrastructure that replaces the training-time input of the graph. The bottom line is that the
serving_input_fn()
which you filled with a print should in essence return an input op. This is also stated in the documentation. Hence, instead of
print("asdf")
it should do something similar to adding an input chain (which should be similar to what builder.add_meta_graph_and_variables also adds). Examples of serving_input_fn()s can be found in the cloudml samples: https://github.com/GoogleCloudPlatform/cloudml-samples/blob/master/census/customestimator/trainer/model.py#L240. The following, for example, serves input from JSON:
There are two kinds of TensorFlow applications:

- Functions that use tf.Session() are from the "low level" TensorFlow examples.
- I'm going to explain how to export "high level" TensorFlow models (using export_savedmodel).

The function export_savedmodel requires the argument serving_input_receiver_fn: a function without arguments which defines the input for the model and the predictor. Therefore, you must create your own serving_input_receiver_fn, where the model input type matches the model input in the training script, and the predictor input type matches the predictor input in the testing script.

On the other hand, if you create a custom model, you must define the export_outputs, built with the function tf.estimator.export.PredictOutput, whose input is a dictionary defining a name that has to match the name of the predictor output in the testing script. For example:
TRAINING SCRIPT
TESTING SCRIPT
(Code tested in Python 3.6.3, Tensorflow 1.4.0)
If you try to use predictor with TensorFlow > 1.6, you can get an error. Here is a working example, tested on 1.7.0:

SAVING:
First, you need to define the features' lengths in dict format, like this:
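A minimal sketch, assuming a single 4-dimensional feature called 'features' (both the name and the length are my own choices):

```python
# Map each feature name to the length of its vector.
feature_lengths = {'features': 4}
```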
Then you have to build a function which has a placeholder with the same shape as the features and returns it using tf.estimator.export.ServingInputReceiver:
Then just save with export_savedmodel:
Full example code:
Restoring
Now let's restore the model:
Here is an IPython notebook demo example with data and explanation: