Tensorflow: What is the output node name in Cifar-10?

Posted 2019-08-20 07:05

I'm trying to understand TensorFlow and I'm looking at one of the official examples, the Cifar-10 model.

In cifar10.py, in inference(), you can see the following lines:

with tf.variable_scope('softmax_linear') as scope:
    weights = _variable_with_weight_decay('weights', [192, NUM_CLASSES],
                                      stddev=1/192.0, wd=0.0)
    biases = _variable_on_cpu('biases', [NUM_CLASSES],
                          tf.constant_initializer(0.0))
    softmax_linear = tf.add(tf.matmul(local4, weights), biases, name=scope.name)
    _activation_summary(softmax_linear)

scope.name should be softmax_linear, so I expected that to be the node's name. I saved the graph proto with the following lines (this differs slightly from the tutorial):

with tf.Graph().as_default():
    global_step = tf.Variable(0, trainable=False)

    # Get images and labels
    images, labels = cifar10.distorted_inputs()


    # Build a Graph that computes the logits predictions from the
    # inference model.
    logits = cifar10.inference(images)

    # Calculate loss.
    loss = cifar10.loss(logits, labels)

    # Build a Graph that trains the model with one batch of examples and
    # updates the model parameters.
    train_op = cifar10.train(loss, global_step)

    # Create a saver.
    saver = tf.train.Saver(tf.global_variables())

    # Build the summary operation based on the TF collection of Summaries.
    summary_op = tf.summary.merge_all()

    # Build an initialization operation to run below.
    init = tf.global_variables_initializer()

    # Start running operations on the Graph.
    sess = tf.Session(config=tf.ConfigProto(
        log_device_placement=FLAGS.log_device_placement))
    sess.run(init)

    # save the graph
    tf.train.write_graph(sess.graph_def, FLAGS.train_dir, 'model.pbtxt')  

    ....

But I can't find a node called softmax_linear in model.pbtxt. What am I doing wrong? I just want the name of the output node so I can export the graph.

1 Answer
Animai°情兽
Answered 2019-08-20 07:33

The operator's name won't be just "softmax_linear". tf.name_scope() (and tf.variable_scope(), which also opens a name scope) prefixes the names of operators created inside it with the scope name, separated by a /; each operator then gets its own name within that scope. For example, if you write

with tf.name_scope("foo"):
    a = tf.constant(1, name="bar")

then the constant's full name will be "foo/bar". In the Cifar-10 graph, since both the variable scope and the add op are named softmax_linear, the output node should show up in model.pbtxt as softmax_linear/softmax_linear.
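
As a quick check, here is a minimal sketch (TF 1.x API assumed; tf.get_variable stands in for the tutorial's _variable_with_weight_decay/_variable_on_cpu helpers, and NUM_CLASSES is hard-coded to 10) that reproduces the scoping and lists the node names so the output node can be spotted:

import tensorflow as tf

with tf.Graph().as_default() as g:
    with tf.variable_scope('softmax_linear') as scope:
        # Stand-in for local4 from the tutorial's inference() function.
        local4 = tf.placeholder(tf.float32, [None, 192], name='local4')
        weights = tf.get_variable('weights', [192, 10])
        biases = tf.get_variable('biases', [10],
                                 initializer=tf.constant_initializer(0.0))
        softmax_linear = tf.add(tf.matmul(local4, weights), biases,
                                name=scope.name)

    # The variable scope also acts as a name scope, so the op name is prefixed:
    print(softmax_linear.op.name)  # softmax_linear/softmax_linear

    # The same names appear in the GraphDef that tf.train.write_graph() saves,
    # so you can grep model.pbtxt (or iterate over the nodes) for the prefix:
    for node in g.as_graph_def().node:
        if node.name.startswith('softmax_linear'):
            print(node.name)

That prefixed name (softmax_linear/softmax_linear here) is what you would pass as the output node when exporting, e.g. to freeze_graph's --output_node_names flag.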

Hope that helps!
