path: root/tensorflow/g3doc/how_tos/variables/index.md
author	Vijay Vasudevan <vrv@google.com>	2015-11-18 10:47:35 -0800
committer	Vijay Vasudevan <vrv@google.com>	2015-11-18 10:47:35 -0800
commit	ab34d55ce7618e52069a2e1c9e51aac5a1ea81c3 (patch)
tree	9c79427b45ff6501e8374ceb7b4fc3bdb2828e15 /tensorflow/g3doc/how_tos/variables/index.md
parent	9eb88d56ab6a9a361662d73a258593d8fbf10b62 (diff)
TensorFlow: more features, performance improvements, and doc fixes.
Changes:
- Add Split/Concat() methods to TensorUtil (meant for convenience, not speed) by Chris.
- Changes to linear algebra ops interface by Rasmus.
- Tests for tensorboard by Daniel.
- Fix bug in histogram calculation by Cassandra.
- Added tool for backwards compatibility of OpDefs. The tool checks in the history of OpDefs and their changes, and checks for backwards-incompatible changes. All done by @josh11b.
- Fix some protobuf example proto docs by Oliver.
- Add derivative of MatrixDeterminant by @yaroslavvb.
- Add a priority queue by @ebrevdo.
- Doc and typo fixes by Aurelien and @dave-andersen.
- Speed improvements to ConvBackwardFilter by @andydavis.
- Improve speed of Alexnet on TitanX by @zheng-xq.
- Add some host memory annotations to some GPU kernels by Yuan.
- Add support for doubles in histogram summary by @jmchen-g.

Base CL: 108158338
Diffstat (limited to 'tensorflow/g3doc/how_tos/variables/index.md')
-rw-r--r--	tensorflow/g3doc/how_tos/variables/index.md	20
1 file changed, 10 insertions(+), 10 deletions(-)
diff --git a/tensorflow/g3doc/how_tos/variables/index.md b/tensorflow/g3doc/how_tos/variables/index.md
index 9ff47b121a..6c3bce6c0b 100644
--- a/tensorflow/g3doc/how_tos/variables/index.md
+++ b/tensorflow/g3doc/how_tos/variables/index.md
@@ -1,4 +1,4 @@
-# Variables: Creation, Initialization, Saving, and Loading <a class="md-anchor" id="AUTOGENERATED-variables--creation--initialization--saving--and-loading"></a>
+# Variables: Creation, Initialization, Saving, and Loading
When you train a model, you use [variables](../../api_docs/python/state_ops.md)
to hold and update parameters. Variables are in-memory buffers containing
@@ -13,7 +13,7 @@ their reference manual for a complete description of their API:
* The [`tf.train.Saver`](../../api_docs/python/state_ops.md#Saver) class.
-## Creation <a class="md-anchor" id="AUTOGENERATED-creation"></a>
+## Creation
When you create a [Variable](../../api_docs/python/state_ops.md) you pass a
`Tensor` as its initial value to the `Variable()` constructor. TensorFlow
@@ -43,7 +43,7 @@ Calling `tf.Variable()` adds several ops to the graph:
The value returned by `tf.Variable()` is an instance of the Python class
`tf.Variable`.
-## Initialization <a class="md-anchor" id="AUTOGENERATED-initialization"></a>
+## Initialization
Variable initializers must be run explicitly before other ops in your model can
be run. The easiest way to do that is to add an op that runs all the variable
@@ -74,7 +74,7 @@ with tf.Session() as sess:
...
```
-### Initialization from another Variable <a class="md-anchor" id="AUTOGENERATED-initialization-from-another-variable"></a>
+### Initialization from another Variable
You sometimes need to initialize a variable from the initial value of another
variable. As the op added by `tf.initialize_all_variables()` initializes all
@@ -96,7 +96,7 @@ w2 = tf.Variable(weights.initialized_value(), name="w2")
w_twice = tf.Variable(weights.initialized_value() * 2.0, name="w_twice")
```
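The semantics quoted above can be sketched in plain Python (this is a conceptual analogy, not TensorFlow code): `initialized_value()` seeds a new variable from the *initial* value of an existing one, so later updates to the source variable do not leak into the derived ones. The names and shapes below are stand-ins for the `weights`/`w2`/`w_twice` example in the hunk.

```python
import random

# Plain-Python sketch (NOT TensorFlow): mimic initializing one variable
# from the *initial* value of another, as weights.initialized_value() does.
random.seed(0)

# Stand-in for tf.random_normal(..., stddev=0.35): weights' initial value.
weights_initial = [random.gauss(0.0, 0.35) for _ in range(10)]
weights = list(weights_initial)  # the live variable; training may change it

# w2 is seeded from weights' *initial* value, like weights.initialized_value().
w2 = list(weights_initial)

# w_twice starts at twice the initial value of weights.
w_twice = [2.0 * v for v in weights_initial]

# A later update to `weights` does not affect w2 or w_twice.
weights[0] += 1.0
print(w2[0] == weights_initial[0])  # True
```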
-### Custom Initialization <a class="md-anchor" id="AUTOGENERATED-custom-initialization"></a>
+### Custom Initialization
The convenience function `tf.initialize_all_variables()` adds an op to
initialize *all variables* in the model. You can also pass it an explicit list
@@ -104,7 +104,7 @@ of variables to initialize. See the
[Variables Documentation](../../api_docs/python/state_ops.md) for more options,
including checking if variables are initialized.
-## Saving and Restoring <a class="md-anchor" id="AUTOGENERATED-saving-and-restoring"></a>
+## Saving and Restoring
The easiest way to save and restore a model is to use a `tf.train.Saver` object.
The constructor adds `save` and `restore` ops to the graph for all, or a
@@ -112,7 +112,7 @@ specified list, of the variables in the graph. The saver object provides
methods to run these ops, specifying paths for the checkpoint files to write to
or read from.
-### Checkpoint Files <a class="md-anchor" id="AUTOGENERATED-checkpoint-files"></a>
+### Checkpoint Files
Variables are saved in binary files that, roughly, contain a map from variable
names to tensor values.
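The "map from variable names to tensor values" idea can be illustrated in plain Python (a conceptual sketch only; `tf.train.Saver` writes its own binary checkpoint format, and `pickle` plus nested lists are stand-ins here):

```python
import os
import pickle
import tempfile

# Conceptual sketch (NOT TensorFlow's checkpoint format): a checkpoint is,
# roughly, a map from variable names to tensor values.
variables = {
    "v1": [[1.0, 2.0], [3.0, 4.0]],  # stand-in for a 2x2 tensor
    "v2": [0.5, 0.5],                # stand-in for a length-2 vector
}

ckpt_path = os.path.join(tempfile.mkdtemp(), "model.ckpt")

# "Save": serialize the name -> value map to a binary file.
with open(ckpt_path, "wb") as f:
    pickle.dump(variables, f)

# "Restore": read the map back and look values up by variable name.
with open(ckpt_path, "rb") as f:
    restored = pickle.load(f)

print(restored["v2"])  # [0.5, 0.5]
```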
@@ -122,7 +122,7 @@ variables in the checkpoint files. By default, it uses the value of the
[`Variable.name`](../../api_docs/python/state_ops.md#Variable.name) property for
each variable.
-### Saving Variables <a class="md-anchor" id="AUTOGENERATED-saving-variables"></a>
+### Saving Variables
Create a `Saver` with `tf.train.Saver()` to manage all variables in
the model.
@@ -149,7 +149,7 @@ with tf.Session() as sess:
print "Model saved in file: ", save_path
```
-### Restoring Variables <a class="md-anchor" id="AUTOGENERATED-restoring-variables"></a>
+### Restoring Variables
The same `Saver` object is used to restore variables. Note that when you
restore variables from a file you do not have to initialize them beforehand.
@@ -172,7 +172,7 @@ with tf.Session() as sess:
...
```
-### Choosing which Variables to Save and Restore <a class="md-anchor" id="AUTOGENERATED-choosing-which-variables-to-save-and-restore"></a>
+### Choosing which Variables to Save and Restore
If you do not pass any argument to `tf.train.Saver()` the saver handles all
variables in the graph. Each one of them is saved under the name that was