How to deploy a TensorFlow model to Azure ML Workbench

Posted 2019-05-14 09:11

I am using Azure ML Workbench to perform binary classification. So far everything works fine: I'm getting good accuracy, and I would like to deploy the model as a web service for inference.

I don't really know where to start: Azure provides this doc, but the example uses sklearn and pickle, not TensorFlow.

I'm not even sure whether I should save and restore the model with tf.train.Saver() or with tf.saved_model.builder.SavedModelBuilder().

If anyone has a good example that uses vanilla TensorFlow in Azure ML Workbench, that'd be great.
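For reference, the two save paths look roughly like this in TF 1.x (a sketch only; the variable, paths, and tag are placeholders to keep it self-contained):

import tensorflow as tf

# A trivial variable so the sketch is self-contained; in practice this
# would be the trained classification graph.
w = tf.Variable(tf.zeros([1]), name='w')

# Option 1: tf.train.Saver -- writes checkpoint files (variables only;
# the graph has to be rebuilt or re-imported before restoring).
saver = tf.train.Saver()
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.save(sess, './checkpoints/model.ckpt')

# Option 2: SavedModelBuilder -- writes a self-contained SavedModel
# directory (graph + variables) that serving tools can load directly.
builder = tf.saved_model.builder.SavedModelBuilder('./saved_model')
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    builder.add_meta_graph_and_variables(sess, [tf.saved_model.tag_constants.SERVING])
builder.save()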

1 Answer

干净又极端 · 2019-05-14 09:25

Ok, so for anyone wondering the same thing, I found the answer. Instead of using a pickled model, I saved my model as a protobuf by following this (a rough sketch of the freezing step is at the end of this answer). Then I wrote the init(), run() and load_graph() methods like so:

import os
import json
import tensorflow as tf

def init():
    global persistent_session, model, x, y, keep_prob, inputs_dc, prediction_dc
    # inputs_dc / prediction_dc are assumed to be the model data collectors
    # set up as in the Azure ML doc linked in the question
    # load the frozen graph and connect the input / output tensors
    model = load_graph(os.path.join(os.environ['AZUREML_NATIVE_SHARE_DIRECTORY'], 'frozen_model.pb'))
    x = model.get_tensor_by_name('prefix/Placeholder:0')                # input features
    y = model.get_tensor_by_name('prefix/convNet/sample_prediction:0')  # prediction output
    keep_prob = model.get_tensor_by_name('prefix/Placeholder_3:0')      # dropout keep probability
    # keep one session open so the graph is loaded only once per container
    persistent_session = tf.Session(graph=model)

# load the graph from protobuf file
def load_graph(frozen_graph_filename):
    with tf.gfile.GFile(frozen_graph_filename, "rb") as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())

    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def, name="prefix")
    return graph
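To find the exact tensor names to feed and fetch (the prefix/... strings in init()), it helps to list the operations in the imported graph; a minimal sketch, assuming the protobuf sits in the working directory:

# print every operation name in the imported graph to locate inputs / outputs
graph = load_graph('frozen_model.pb')
for op in graph.get_operations():
    print(op.name)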

# run the inference for one request
def run(input_array):
    global inputs_dc, prediction_dc
    try:
        prediction = persistent_session.run(y, feed_dict={x: input_array, keep_prob: 1.0})
        print("prediction : ", prediction)
        # log inputs and predictions with the model data collectors
        inputs_dc.collect(input_array)
        prediction_dc.collect(prediction.tolist())
        return json.dumps(prediction.tolist())
    except Exception as e:
        return str(e)
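Before publishing the service you can smoke-test the scoring code locally; a minimal sketch, assuming a (1, n_features) float input (the shape below is illustrative) and that inputs_dc / prediction_dc are set up or stubbed out:

import numpy as np

init()
sample = np.zeros((1, 10), dtype=np.float32)  # illustrative shape, not my real feature count
print(run(sample))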

It probably needs some cleaning, but it works!
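For anyone who doesn't want to follow the link above, freezing the trained graph into a single protobuf looks roughly like this (a sketch using the TF 1.x graph_util API; the checkpoint path and output node name are illustrative):

import tensorflow as tf

# restore the trained graph from a checkpoint, convert variables to
# constants, and write one self-contained protobuf
checkpoint = './checkpoints/model.ckpt'    # illustrative path
output_node = 'convNet/sample_prediction'  # illustrative output node name

saver = tf.train.import_meta_graph(checkpoint + '.meta')
with tf.Session() as sess:
    saver.restore(sess, checkpoint)
    frozen = tf.graph_util.convert_variables_to_constants(
        sess, tf.get_default_graph().as_graph_def(), [output_node])
with tf.gfile.GFile('frozen_model.pb', 'wb') as f:
    f.write(frozen.SerializeToString())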
