I know I can increment a variable_scope using the 'default_name' argument:
import tensorflow as tf

with tf.variable_scope("A"):        # this is scope "A"
    pass
with tf.variable_scope(None, "A"):  # incremented scope "A_1"
    pass
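The chosen name can also be read off the scope object itself; a minimal sketch, assuming TF 1.x and a fresh graph (i.e. not run right after the snippet above, otherwise the counters have already advanced):

import tensorflow as tf

with tf.variable_scope("A") as scope:
    print(scope.name)               # 'A'
with tf.variable_scope(None, "A") as scope:
    print(scope.name)               # 'A_1' -- the default_name was uniquified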
However, this no longer works when an outer scope is re-entered:
reuse = tf.AUTO_REUSE

with tf.variable_scope("A", reuse=reuse):
    with tf.variable_scope("B", reuse=tf.AUTO_REUSE):
        print(tf.get_variable("x", (), tf.float32))  # 'A/B/x:0'
    with tf.variable_scope(None, "B"):               # increments B, as expected
        print(tf.get_variable("x", (), tf.float32))  # 'A/B_1/x:0'

# Re-enter A and try to increment B
with tf.variable_scope("A", reuse=reuse):
    with tf.variable_scope(None, "B"):               # does NOT increment B !!!
        print(tf.get_variable("x", (), tf.float32))  # 'A/B/x:0' !!!
- Is there a way to increment "B" after re-entering "A"?
- The re-entered scope shares its variables with the original "A" scope, but not the counter used to auto-increment the names of its inner scopes (see the sketch below). I find this very confusing and wonder about the rationale.
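To make the second point concrete, here is a minimal sketch (TF 1.x, fresh graph; the names v1/v2 are only for this illustration) showing that re-entering "A" hands back the very same variable object, while the default_name counter for its inner scopes starts over:

import tensorflow as tf

with tf.variable_scope("A", reuse=tf.AUTO_REUSE):
    with tf.variable_scope(None, "B"):
        v1 = tf.get_variable("x", (), tf.float32)    # 'A/B/x:0'

# Re-enter "A": the default_name "B" is not incremented...
with tf.variable_scope("A", reuse=tf.AUTO_REUSE):
    with tf.variable_scope(None, "B"):
        v2 = tf.get_variable("x", (), tf.float32)    # 'A/B/x:0' again

# ...and, because reuse is inherited from the outer scope, we get the existing variable back.
print(v1 is v2)                                      # True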
Thank you!