path: root/tensorflow/python/ops/variable_scope.py
Diffstat (limited to 'tensorflow/python/ops/variable_scope.py')
 tensorflow/python/ops/variable_scope.py | 21 ++++++++++++++++++++-
 1 file changed, 20 insertions(+), 1 deletion(-)
diff --git a/tensorflow/python/ops/variable_scope.py b/tensorflow/python/ops/variable_scope.py
index f49e2d314d..47414c28af 100644
--- a/tensorflow/python/ops/variable_scope.py
+++ b/tensorflow/python/ops/variable_scope.py
@@ -1786,6 +1786,23 @@ class variable_scope(object):
assert v.name == "foo/bar/v:0"
```
+ Simple example of how to re-enter a premade variable scope safely:
+
+ ```python
+ with tf.variable_scope("foo") as vs:
+ pass
+
+ # Re-enter the variable scope.
+ with tf.variable_scope(vs,
+ auxiliary_name_scope=False) as vs1:
+ # Restore the original name_scope.
+ with tf.name_scope(vs1.original_name_scope):
+ v = tf.get_variable("v", [1])
+ assert v.name == "foo/v:0"
+ c = tf.constant([1], name="c")
+ assert c.name == "foo/c:0"
+ ```
+
Basic example of sharing a variable with AUTO_REUSE:
```python
@@ -1924,7 +1941,9 @@ class variable_scope(object):
(which must have the same shape). Constraints are not safe to
use when doing asynchronous distributed training.
auxiliary_name_scope: If `True`, we create an auxiliary name scope with
- the scope. If `False`, we don't touch name scope.
+ the scope. If `False`, we don't create it. Note that the argument is
+ not inherited, and it only takes effect once, when the scope is
+ created. You should only use it to re-enter a premade variable scope.
Returns:
A scope that can be captured and reused.
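
The reason re-entering must go through `vs.original_name_scope` (rather than the scope's string name) is that entering a name scope by string is uniquified by the graph, producing `foo_1/`, `foo_2/`, and so on. As a minimal sketch of that behavior without TensorFlow, the `ToyGraph` class below is a hypothetical stand-in (not TensorFlow API) that models a graph's name-scope prefix and uniquification:

```python
# Toy model (hypothetical, NOT TensorFlow API) of how a graph
# uniquifies name scopes. It illustrates why re-entering "foo" by
# string yields "foo_1/", while restoring a saved original prefix
# keeps new ops under "foo/".

class ToyGraph:
    def __init__(self):
        self.prefix = ""      # current name-scope prefix, e.g. "foo/"
        self.used = set()     # scope names already claimed

    def enter(self, name):
        """Enter a scope by string; uniquify if the name is taken."""
        candidate = self.prefix + name
        unique, i = candidate, 1
        while unique in self.used:
            unique = "%s_%d" % (candidate, i)
            i += 1
        self.used.add(unique)
        saved = self.prefix
        self.prefix = unique + "/"
        return saved, self.prefix

    def restore(self, original_prefix):
        """Re-enter by a saved prefix; no uniquification happens.

        This mirrors auxiliary_name_scope=False combined with
        tf.name_scope(vs.original_name_scope) in the docstring above.
        """
        saved = self.prefix
        self.prefix = original_prefix
        return saved


g = ToyGraph()
saved, foo = g.enter("foo")     # first entry: prefix becomes "foo/"
g.prefix = saved                # leave the scope

_, foo_again = g.enter("foo")   # naive re-entry is uniquified
assert foo_again == "foo_1/"
g.prefix = saved

g.restore(foo)                  # restore the saved original prefix
assert g.prefix == "foo/"       # new ops would land under "foo/"
```

In real TensorFlow the same pattern is the one shown in the docstring: pass the premade scope object with `auxiliary_name_scope=False`, then open `tf.name_scope(vs.original_name_scope)` so both variables and ops keep the `foo/` prefix.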