I'm a mobile developer, and I want to use various TensorFlow Lite models (.tflite) with ML Kit.
The problem is that I have no idea how to find a .tflite model's input/output feature info (these are needed as parameters for setup).
Is there any way to find that out?
Sorry for my bad English, and thanks.
Update (18.06.13):
I found this site: https://lutzroeder.github.io/Netron/.
It visualizes the graph of an uploaded model (such as .mlmodel or .tflite) so you can find the input/output format.
Here is an example screenshot: https://lutzroeder.github.io/Netron
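Another option, as a minimal sketch (assuming Python with a recent TensorFlow version is installed; the file name model.tflite below is just a placeholder for your own model): load the model with the TFLite Interpreter and print its input/output details.

import tensorflow as tf

# Load the model and allocate tensors so shape information is populated.
interpreter = tf.lite.Interpreter(model_path="model.tflite")  # placeholder path
interpreter.allocate_tensors()

# Each entry holds the tensor name, shape, and dtype you need for setup.
for detail in interpreter.get_input_details():
    print("input:", detail["name"], detail["shape"], detail["dtype"])
for detail in interpreter.get_output_details():
    print("output:", detail["name"], detail["shape"], detail["dtype"])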
If you already have a tflite model that you did not produce yourself, and you want to look inside the tflite file and understand your inputs and outputs, you can use the flatc tool to convert the model to a .json file and read that.
First, clone the flatbuffers repo and build flatc.
git clone https://github.com/google/flatbuffers.git
Then you need the TensorFlow schema.fbs stored locally. Either check out the TensorFlow GitHub repo or download just that one file.
Then you can run flatc to generate a json file from the input tflite model.
flatc -t schema.fbs -- input_model.tflite
This will create an input_model.json file that can be easily read.
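If you only want the input/output info out of that json, here is a minimal Python sketch. It assumes the JSON uses the field names from the TFLite schema.fbs (a subgraphs list, each with tensors, inputs, and outputs) and that flatc may omit fields left at their default values.

import json

# Load the JSON produced by flatc; field names follow the TFLite schema.fbs.
with open("input_model.json") as f:
    model = json.load(f)

# The main graph is normally the first subgraph.
graph = model["subgraphs"][0]
tensors = graph["tensors"]

# "inputs" and "outputs" are indices into the subgraph's tensor list.
for idx in graph["inputs"]:
    t = tensors[idx]
    print("input:", t.get("name"), t.get("shape"), t.get("type", "FLOAT32"))
for idx in graph["outputs"]:
    t = tensors[idx]
    print("output:", t.get("name"), t.get("shape"), t.get("type", "FLOAT32"))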
Adding to the above answer:
See these instructions for building:
https://google.github.io/flatbuffers/md__building.html