Suppose we have two TensorFlow computation graphs, G1 and G2, with saved weights W1 and W2. Assume we build a new graph G simply by constructing both G1 and G2. How can we restore both W1 and W2 for this new graph G?
For a simple example:
import tensorflow as tf

V1 = tf.Variable(tf.zeros([1]))
saver_1 = tf.train.Saver()  # sees only V1 at construction time
V2 = tf.Variable(tf.zeros([1]))
saver_2 = tf.train.Saver()  # sees both V1 and V2
sess = tf.Session()
saver_1.restore(sess, 'W1')
saver_2.restore(sess, 'W2')
In this example, saver_1 successfully restores the corresponding V1, but saver_2 fails with a NotFoundError.
You can probably use two savers where each saver looks for just one of the variables. If you just use tf.train.Saver(), I think it will look for all variables you have defined. You can give it a list of variables to look for by using tf.train.Saver([v1, ...]). For more info, you can read about the tf.train.Saver constructor here: https://www.tensorflow.org/versions/r0.11/api_docs/python/state_ops.html#Saver

Here's a simple working example. Suppose you do your computation in a file "save_vars.py" and it has the following code:
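Something along these lines should work; the variable names, initial values, and the tmp/v1.ckpt and tmp/v2.ckpt checkpoint paths are just illustrative choices, not anything required by the API:

import tensorflow as tf

# Build two variables and give each saver an explicit variable list,
# so each checkpoint holds exactly one variable.
v1 = tf.Variable(tf.constant(1.0, shape=[1]), name="v1")
saver_1 = tf.train.Saver([v1])  # saves/restores only v1

v2 = tf.Variable(tf.constant(-1.0, shape=[1]), name="v2")
saver_2 = tf.train.Saver([v2])  # saves/restores only v2

with tf.Session() as sess:
    sess.run(tf.initialize_all_variables())
    saver_1.save(sess, "tmp/v1.ckpt")
    saver_2.save(sess, "tmp/v2.ckpt")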
If you ensure that you have a tmp directory and run python save_vars.py, you'll get the saved checkpoint files.

Now, you can restore using a file named "restore_vars.py" with the following code:
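Again, a sketch under the same assumptions; the key point is that each restored variable must be built with the same name and shape it had in save_vars.py:

import tensorflow as tf

# Rebuild the variables with the same names/shapes as in save_vars.py,
# and again give each saver only the variable it is responsible for.
v1 = tf.Variable(tf.zeros([1]), name="v1")
saver_1 = tf.train.Saver([v1])

v2 = tf.Variable(tf.zeros([1]), name="v2")
saver_2 = tf.train.Saver([v2])

with tf.Session() as sess:
    # No initialization needed; restore overwrites the variable values.
    saver_1.restore(sess, "tmp/v1.ckpt")
    saver_2.restore(sess, "tmp/v2.ckpt")
    print(sess.run(v1))  # [ 1.]
    print(sess.run(v2))  # [-1.]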
and when you run python restore_vars.py, the output should be the two restored values ([ 1.] and [-1.] with the initial values sketched above; at least on my computer that's the output). Feel free to post a comment if anything was unclear.