I am creating neural nets with TensorFlow and skflow; for some reason I want to get the values of some inner tensors for a given input, so I am using `myClassifier.get_layer_value(input, "tensorName")`, `myClassifier` being a `skflow.estimators.TensorFlowEstimator`.
However, I find it difficult to get the correct syntax for a tensor's name, even when I know the name itself (I keep getting confused between operations and tensors), so I am using TensorBoard to plot the graph and look up the name.
Is there a way to enumerate all the tensors in a graph without using TensorBoard?
This worked for me:
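The code block for this answer appears to have been lost in extraction; the well-known one-liner it refers to lists every node name in the graph's `GraphDef`. A sketch (TF 1.x API, written via `tensorflow.compat.v1` so it also runs on TF 2.x; on TF 1.x a plain `import tensorflow as tf` suffices):

```python
import tensorflow.compat.v1 as tf  # plain `import tensorflow as tf` on TF 1.x

# Build a tiny graph so there is something to list.
g = tf.Graph()
with g.as_default():
    a = tf.constant(1.0, name="a")
    b = tf.constant(2.0, name="b")
    c = tf.add(a, b, name="c")

# Every node (operation) name in the graph:
print([n.name for n in g.as_graph_def().node])  # ['a', 'b', 'c']
```

For the default graph, replace `g` with `tf.get_default_graph()`.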
`tf.all_variables()` can get you the information you want.

Also, a commit made recently in TensorFlow Learn provides a function `get_variable_names` in the estimator, which you can use to retrieve all variable names easily.

I think this will do too:
But compared with Salvado's and Yaroslav's answers, I don't know which one is better.
You can do:
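The code for this answer was stripped; it likely referred to `Graph.get_operations()`, which enumerates every operation in the graph. A sketch:

```python
import tensorflow.compat.v1 as tf  # plain `import tensorflow as tf` on TF 1.x

g = tf.Graph()
with g.as_default():
    a = tf.constant(1.0, name="a")
    b = tf.constant(2.0, name="b")
    s = tf.add(a, b, name="sum")

# Print the name of every operation in the graph:
for op in g.get_operations():
    print(op.name)  # a, b, sum
```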
Also, if you are prototyping in an IPython notebook, you can show the graph directly in the notebook; see the `show_graph` function in Alexander's Deep Dream notebook.

The accepted answer only gives you a list of strings with the names. I prefer a different approach, which gives you (almost) direct access to the tensors:
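The stripped code that built `list_of_tuples` collected, per the surrounding text, one tuple of output tensors per operation. A sketch (the original likely used `op.values()`, the TF 1.x equivalent of `op.outputs`):

```python
import tensorflow.compat.v1 as tf  # plain `import tensorflow as tf` on TF 1.x

graph = tf.Graph()
with graph.as_default():
    a = tf.constant(1.0, name="a")
    b = tf.constant(2.0, name="b")
    s = tf.add(a, b, name="sum")

# One tuple of output tensors per operation:
list_of_tuples = [tuple(op.outputs) for op in graph.get_operations()]
# Adapted to get the tensors themselves, flattened:
tensors = [t for tup in list_of_tuples for t in tup]
print([t.name for t in tensors])  # ['a:0', 'b:0', 'sum:0']
```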
`list_of_tuples` now contains every tensor, each within a tuple. You could also adapt it to get the tensors directly.

Previous answers are good; I'd just like to share a utility function I wrote to select tensors from a graph:
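The utility function itself was stripped; a minimal sketch of such a selector, filtering tensors by a regex on their names (the name `get_tensors` and its signature are assumptions, not the original author's code):

```python
import re
import tensorflow.compat.v1 as tf  # plain `import tensorflow as tf` on TF 1.x

def get_tensors(graph, pattern):
    """Return all tensors in `graph` whose names match `pattern` (a regex)."""
    return [t for op in graph.get_operations()
            for t in op.outputs
            if re.search(pattern, t.name)]

# Usage on a small graph:
g = tf.Graph()
with g.as_default():
    a = tf.constant(1.0, name="a")
    b = tf.constant(2.0, name="b")
    s = tf.add(a, b, name="sum")

print([t.name for t in get_tensors(g, "sum")])  # ['sum:0']
```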
So if you have a graph with some named ops, running the function with a name pattern returns only the tensors whose names match that pattern.