Use Azure Custom Vision trained model with TensorFlow.js

Published: 2020-06-25 01:25

Question:

I've trained a model with Azure Custom Vision and downloaded the TensorFlow files for Android (see: https://docs.microsoft.com/en-au/azure/cognitive-services/custom-vision-service/export-your-model). How can I use this with tensorflow.js?

I need a model (pb file) and weights (json file). However, Azure gives me a .pb file and a text file with tags.

From my research I also understand that there are different kinds of .pb files, but I can't find which type Azure Custom Vision exports.

I found the tfjs converter. It converts a TensorFlow SavedModel (is the *.pb file from Azure a SavedModel?) or a Keras model to a web-friendly format. However, I need to fill in "output_node_names" (how do I get these?). I'm also not 100% sure that my .pb file for Android is the same thing as a "tf_saved_model".

I hope someone has a tip or a starting point.

Answer 1:

Just parroting what I said here to save you a click. I do hope that the option to export directly to tfjs is available soon.

These are the steps I took to get an exported TensorFlow model working for me:

  1. Replace PadV2 operations with Pad. The Python function below should do it: input_filepath is the path to the .pb model file, and output_filepath is the full path of the updated .pb file that will be created (a usage sketch follows after this list).
import tensorflow as tf

def ReplacePadV2(input_filepath, output_filepath):
    # Load the frozen graph from disk. (This uses the TF 1.x GraphDef
    # API; under TF 2.x it is available as tf.compat.v1.GraphDef.)
    graph_def = tf.GraphDef()
    with open(input_filepath, 'rb') as f:
        graph_def.ParseFromString(f.read())

    # PadV2 takes an extra constant_values input that Pad does not,
    # so rewrite the op type and drop that last input.
    for node in graph_def.node:
        if node.op == 'PadV2':
            node.op = 'Pad'
            del node.input[-1]
            print("Replaced PadV2 node: {}".format(node.name))

    # Serialize the modified graph back out to a .pb file.
    with open(output_filepath, 'wb') as f:
        f.write(graph_def.SerializeToString())
  2. Install tensorflowjs 0.8.6 or earlier. Converting frozen models is deprecated in later versions.
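With pip, for example:

pip install tensorflowjs==0.8.6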
  3. When calling the converter, set --input_format to tf_frozen_model and set output_node_names to model_outputs. This is the command I used.
tensorflowjs_converter --input_format=tf_frozen_model --output_json=true --output_node_names='model_outputs' --saved_model_tags=serve  path\to\modified\model.pb  folder\to\save\converted\output
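For completeness, calling the function from step 1 looks like this (a minimal sketch; both paths are placeholders for your exported and patched .pb files):

# Rewrite PadV2 ops before running the converter; placeholder paths.
ReplacePadV2(r'path\to\exported\model.pb', r'path\to\modified\model.pb')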

Ideally, tf.loadGraphModel('path/to/converted/model.json') should now work (tested for tfjs 1.0.0 and above).



Answer 2:

Partial answer:

I'm trying to achieve the same thing. Here is the start of an answer that makes use of the output_node_names:

tensorflowjs_converter --input_format=tf_frozen_model --output_node_names='model_outputs' model.pb web_model
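To confirm the output node name for your own export, instead of assuming 'model_outputs', you can list the nodes in the frozen graph. This is a minimal sketch using the TF 1.x GraphDef API; 'model.pb' is a placeholder path:

import tensorflow as tf

# Parse the file as a frozen graph; a frozen graph parses as a
# GraphDef, while a SavedModel's .pb generally will not.
graph_def = tf.GraphDef()
with open('model.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

# Print every op and node name; the output node is typically one of
# the last nodes, whose output no other node consumes.
for node in graph_def.node:
    print(node.op, node.name)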

I am not yet sure how to incorporate this into the same code. Do you have anything, @Kasper Kamperman?