Diffstat (limited to 'tensorflow/contrib/layers/python/layers/optimizers.py')
-rw-r--r--  tensorflow/contrib/layers/python/layers/optimizers.py  |  5 +++--
1 file changed, 3 insertions(+), 2 deletions(-)
diff --git a/tensorflow/contrib/layers/python/layers/optimizers.py b/tensorflow/contrib/layers/python/layers/optimizers.py
index ac217f043f..7eb410b4c7 100644
--- a/tensorflow/contrib/layers/python/layers/optimizers.py
+++ b/tensorflow/contrib/layers/python/layers/optimizers.py
@@ -129,8 +129,9 @@ def optimize_loss(loss,
`None` to use all trainable variables.
name: The name for this operation is used to scope operations and summaries.
summaries: List of internal quantities to visualize on tensorboard. If not
- set only the loss and the learning rate will be reported. The
- complete list is in OPTIMIZER_SUMMARIES.
+ set, the loss, the learning rate, and the global norm of the
+ gradients will be reported. The complete list of possible values
+ is in OPTIMIZER_SUMMARIES.
colocate_gradients_with_ops: If True, try colocating gradients with the
corresponding op.
increment_global_step: Whether to increment `global_step`. If your model
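
The docstring change above documents that when `summaries` is not set, `optimize_loss` reports the loss, the learning rate, and the global norm of the gradients by default, with the full set of valid values listed in `OPTIMIZER_SUMMARIES`. As a rough illustration of that documented behavior (this is a standalone sketch, not the TensorFlow source; the helper name `resolve_summaries` and the exact contents of `OPTIMIZER_SUMMARIES` here are assumptions for illustration):

```python
# Illustrative sketch of the documented `summaries` default in
# tf.contrib.layers.optimize_loss. Names and list contents are
# assumptions, not the actual TensorFlow implementation.

# Assumed set of valid summary names (mirrors the docstring's
# reference to OPTIMIZER_SUMMARIES).
OPTIMIZER_SUMMARIES = [
    "learning_rate",
    "loss",
    "gradients",
    "gradient_norm",
]

def resolve_summaries(summaries=None):
    """Return the summaries to report.

    Per the updated docstring: if `summaries` is not set, report the
    loss, the learning rate, and the global norm of the gradients.
    Otherwise validate the requested names against OPTIMIZER_SUMMARIES.
    """
    if summaries is None:
        return ["loss", "learning_rate", "gradient_norm"]
    unknown = set(summaries) - set(OPTIMIZER_SUMMARIES)
    if unknown:
        raise ValueError("Unknown summaries: %s" % sorted(unknown))
    return list(summaries)
```

Under this sketch, `resolve_summaries()` yields the three default quantities, while an explicit list such as `["loss"]` is passed through after validation.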