In TensorFlow, my model is based on a pre-trained model: I added a few variables and removed some from the pre-trained model. When I restore the variables from the checkpoint file, I have to explicitly list every variable I added to the graph so that it can be excluded. For example, I did:
import tensorflow as tf
import tensorflow.contrib.slim as slim

exclude = [...]  # explicitly list the names of all added variables to exclude
variables_to_restore = slim.get_variables_to_restore(exclude=exclude)
saver = tf.train.Saver(variables_to_restore)
Is there a simpler way to do this? Namely: whenever a variable is not in the checkpoint, just don't try to restore it.
The only thing you can do is to first build the same model as the one in the checkpoint, and then restore the checkpoint values into that model. After the variables have been restored, you can add new layers, delete existing layers, or change their weights.
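A minimal sketch of this workflow, assuming a TF1-style graph; build_pretrained_model, the layer sizes, and the checkpoint path are hypothetical stand-ins:

import tensorflow as tf

def build_pretrained_model():
    # Hypothetical stand-in for whatever rebuilds the checkpointed architecture.
    x = tf.placeholder(tf.float32, [None, 4], name='x')
    return tf.layers.dense(x, 8, name='pretrained_layer')

features = build_pretrained_model()

# A Saver created at this point only knows the variables that exist so far,
# i.e. exactly the pre-trained ones.
restore_saver = tf.train.Saver()

# Variables created after the Saver was built are invisible to it, so
# restore() will not look for them in the checkpoint.
logits = tf.layers.dense(features, 2, name='new_head')

with tf.Session() as sess:
    restore_saver.restore(sess, '/path/to/model.ckpt')  # hypothetical path
    # The new head still needs to be initialized; see the helper below.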
But there is an important point you need to be careful about: after adding new layers, you need to initialize them. If you use
tf.global_variables_initializer()
, you will lose the values of the restored layers. So you should initialize only the uninitialized weights; you can use a function like the following for this.
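The function itself is not included above; a minimal sketch of such a helper, assuming a TF1-style Session, could be:

import tensorflow as tf

def initialize_uninitialized(sess):
    # Check every global variable and collect the ones that have no value yet.
    global_vars = tf.global_variables()
    is_initialized = sess.run([tf.is_variable_initialized(v) for v in global_vars])
    not_initialized = [v for v, inited in zip(global_vars, is_initialized) if not inited]
    # Run the initializers of only those variables, leaving restored weights untouched.
    if not_initialized:
        sess.run(tf.variables_initializer(not_initialized))

Calling initialize_uninitialized(sess) after saver.restore(...) initializes the newly added variables without overwriting the restored ones.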
This is a fuller answer that works for a non-distributed setting:

You should first find out which variables are useful (meaning they appear both in the checkpoint and in your graph), take the intersection of the two sets rather than everything from the checkpoint, and then restore them after defining a saver like this:
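The saver definition and restore call are not shown above; a minimal sketch of the idea, with a hypothetical checkpoint path, might look like this:

import tensorflow as tf

ckpt_path = '/path/to/model.ckpt'  # hypothetical checkpoint path

# Map of variable names to shapes stored in the checkpoint.
reader = tf.train.NewCheckpointReader(ckpt_path)
ckpt_shapes = reader.get_variable_to_shape_map()

# Intersection: graph variables that also exist in the checkpoint,
# matching both name and shape.
restore_vars = [v for v in tf.global_variables()
                if v.op.name in ckpt_shapes
                and v.shape.as_list() == ckpt_shapes[v.op.name]]

saver = tf.train.Saver(restore_vars)
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # give every variable a value first
    saver.restore(sess, ckpt_path)               # then overwrite the matched ones

Initializing everything first and restoring afterwards means any variable missing from the checkpoint simply keeps its fresh initial value, so nothing has to be excluded by hand.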