TensorFlow's variable_scope `values` argument

Published: 2019-01-28 08:17

Question:

I am currently reading the source code of the slim library, which is built on TensorFlow, and it uses the `values` argument of the `variable_scope` method a lot, like here.

From the API page I can see:

This context manager validates that the (optional) values are from the same graph, ensures that graph is the default graph, and pushes a name scope and a variable scope.

My question is: are the tensors passed in `values` only checked for coming from the same graph? What are the use cases for this, and why would someone need it?
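The quoted doc says the context manager validates that all `values` belong to one graph and makes that graph the default. A minimal sketch of that validation step, in plain Python rather than TensorFlow itself (the `Graph`/`Tensor` stand-in classes and `validate_same_graph` helper here are illustrative, not TensorFlow's actual implementation):

```python
# Toy stand-ins for a graph and a tensor that remembers its graph.
class Graph:
    pass

class Tensor:
    def __init__(self, graph):
        self.graph = graph

def validate_same_graph(values):
    """Return the common graph of `values`; raise if they disagree.

    This mirrors the check variable_scope performs on its `values`
    argument before pushing the scope.
    """
    graphs = {id(t.graph) for t in values}
    if len(graphs) > 1:
        raise ValueError("passed tensors come from different graphs")
    return values[0].graph if values else None

g1, g2 = Graph(), Graph()
a, b, c = Tensor(g1), Tensor(g1), Tensor(g2)

assert validate_same_graph([a, b]) is g1   # same graph: accepted
try:
    validate_same_graph([a, c])            # mixed graphs: rejected
except ValueError:
    print("mixed graphs rejected")
```

Catching a graph mix-up at scope entry, rather than deep inside an op, is the practical benefit: the error points at the scope you opened, not at some later op construction.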

Answer 1:

The variable_scope parameter helps ensure uniqueness of variables and reuse of variables where desired.

Yes: if you create two or more different computation graphs, they will not necessarily share the same variable scopes; however, there are ways to share variables across graphs, so the option is there.

The primary use case for variable scopes is RNNs, where many of the weights are tied and reused across timesteps. That is one reason someone would need it. The other main reason is to ensure that you reuse the same variables when you explicitly mean to, and not by accident. (In distributed settings this can become a real concern.)