authorGravatar A. Unique TensorFlower <nobody@tensorflow.org>2016-04-04 16:09:52 -0800
committerGravatar TensorFlower Gardener <gardener@tensorflow.org>2016-04-04 17:14:12 -0700
commit24fa7d3a9785c31214fa012e2de4e2ab1d62afe1 (patch)
tree6f8704e30a2e63072574bd1c91f94aa2e0b9d47d
parent6f8edf28ff5d03afb7463e84b185c783c254ea5a (diff)
Update generated Python Op docs.
Change: 118998430
-rw-r--r--tensorflow/g3doc/api_docs/python/contrib.layers.md8
1 files changed, 7 insertions, 1 deletions
diff --git a/tensorflow/g3doc/api_docs/python/contrib.layers.md b/tensorflow/g3doc/api_docs/python/contrib.layers.md
index c0d6250ab1..b5c5a60960 100644
--- a/tensorflow/g3doc/api_docs/python/contrib.layers.md
+++ b/tensorflow/g3doc/api_docs/python/contrib.layers.md
@@ -447,7 +447,13 @@ Given loss and parameters for optimizer, returns a training op.
* <b>`loss`</b>: Tensor, 0 dimensional.
* <b>`global_step`</b>: Tensor, step counter for each update.
* <b>`learning_rate`</b>: float or Tensor, magnitude of update per each training step.
-* <b>`optimizer`</b>: string or function, used as optimizer for training.
+* <b>`optimizer`</b>: string, class or optimizer instance, used as trainer.
+    A string should be the name of an optimizer, such as 'SGD',
+    'Adam' or 'Adagrad'; the full list is in the OPTIMIZER_CLS_NAMES constant.
+    A class should be a sub-class of tf.Optimizer that implements the
+    `compute_gradients` and `apply_gradients` functions.
+    An optimizer instance should be an instantiation of a tf.Optimizer
+    sub-class that has `compute_gradients` and `apply_gradients` functions.
* <b>`clip_gradients`</b>: float or None, clips gradients by this value.
* <b>`moving_average_decay`</b>: float or None, takes into account previous loss
to make learning smoother due to outliers.
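
The three accepted forms of the `optimizer` argument (name string, `tf.Optimizer` sub-class, or instance) could be resolved with dispatch logic along these lines. This is a minimal standalone sketch, not the actual TensorFlow implementation: the `Optimizer` stand-in class, the `resolve_optimizer` helper, and the contents of the `OPTIMIZER_CLS_NAMES` mapping shown here are illustrative assumptions.

```python
import inspect

class Optimizer:
    """Stand-in for tf.Optimizer: exposes the two required functions."""
    def compute_gradients(self, loss): ...
    def apply_gradients(self, grads_and_vars): ...

# Illustrative sub-classes standing in for concrete optimizers.
class SGDOptimizer(Optimizer): pass
class AdamOptimizer(Optimizer): pass

# Mirrors the role of the OPTIMIZER_CLS_NAMES constant: short name -> class.
OPTIMIZER_CLS_NAMES = {"SGD": SGDOptimizer, "Adam": AdamOptimizer}

def resolve_optimizer(optimizer):
    """Accept a string name, an Optimizer sub-class, or an instance,
    and return a ready-to-use optimizer instance."""
    if isinstance(optimizer, str):
        if optimizer not in OPTIMIZER_CLS_NAMES:
            raise ValueError("Unknown optimizer name: %s" % optimizer)
        return OPTIMIZER_CLS_NAMES[optimizer]()
    if inspect.isclass(optimizer) and issubclass(optimizer, Optimizer):
        return optimizer()  # class: instantiate it
    if isinstance(optimizer, Optimizer):
        return optimizer    # instance: use as-is
    raise ValueError("optimizer must be a string, class or instance")
```

For example, `resolve_optimizer("Adam")`, `resolve_optimizer(SGDOptimizer)` and `resolve_optimizer(AdamOptimizer())` all yield an object with `compute_gradients` and `apply_gradients`, which is the contract the doc change above spells out.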