author | 2016-06-29 23:09:23 -0800
---|---
committer | 2016-06-30 00:18:39 -0700
commit | 2cd93cab24c327c216cd238b65288d1b97d0899c (patch)
tree | 43f5ef4ae90802d3386c7c00671a9ad09fba87fa /tensorflow/python/training/training.py
parent | 631764bd0ac83b82a3fa478100ac1852e964691d (diff)
Only the exponentially decaying learning rate (exponential_decay) seems to have been exposed from the learning_rate_decay module; the very useful piecewise_constant decay scheme was not.
Change: 126273419
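To illustrate what this change exposes: `piecewise_constant` keeps the learning rate fixed within step intervals delimited by a list of boundaries. The TensorFlow op operates on tensors, but its semantics can be sketched in plain Python (the function and variable names below are illustrative, not the library's API):

```python
import bisect

def piecewise_constant(step, boundaries, values):
    """Piecewise-constant schedule, mirroring the semantics of
    tf.train.piecewise_constant: values must have one more entry
    than boundaries; values[i] applies while step <= boundaries[i],
    and values[-1] once step exceeds the last boundary."""
    assert len(values) == len(boundaries) + 1
    # bisect_left returns the index of the first boundary >= step,
    # which is exactly the interval index we want.
    return values[bisect.bisect_left(boundaries, step)]

# Example: lr = 1.0 for the first 100k steps, 0.5 until 110k, then 0.1.
boundaries = [100000, 110000]
values = [1.0, 0.5, 0.1]
print(piecewise_constant(50000, boundaries, values))   # 1.0
print(piecewise_constant(105000, boundaries, values))  # 0.5
print(piecewise_constant(200000, boundaries, values))  # 0.1
```

This kind of schedule is common when a model is trained with a few hand-picked learning-rate "phases" rather than a smooth decay curve.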
Diffstat (limited to 'tensorflow/python/training/training.py')
-rw-r--r-- | tensorflow/python/training/training.py | 2 |
1 file changed, 1 insertion(+), 1 deletion(-)
```diff
diff --git a/tensorflow/python/training/training.py b/tensorflow/python/training/training.py
index e6ae335d94..8135a86021 100644
--- a/tensorflow/python/training/training.py
+++ b/tensorflow/python/training/training.py
@@ -198,7 +198,7 @@
 from tensorflow.core.example.feature_pb2 import *
 from tensorflow.core.protobuf.saver_pb2 import *
 # Utility op. Open Source. TODO(touts): move to nn?
-from tensorflow.python.training.learning_rate_decay import exponential_decay
+from tensorflow.python.training.learning_rate_decay import *
 # Distributed computing support
```
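For contrast, the line being replaced had exposed only `exponential_decay`, which scales the rate by `decay_rate ** (step / decay_steps)`. A rough pure-Python sketch of that schedule's semantics (names are illustrative; the real TF op works on tensors):

```python
import math

def exponential_decay(lr, step, decay_steps, decay_rate, staircase=False):
    """Exponentially decayed rate: lr * decay_rate ** (step / decay_steps).
    With staircase=True the exponent is floored, so the rate drops in
    discrete jumps every decay_steps steps instead of continuously."""
    exponent = step / decay_steps
    if staircase:
        exponent = math.floor(exponent)
    return lr * decay_rate ** exponent

# Example: base rate 0.1, decayed by 0.96 every 100 steps.
print(exponential_decay(0.1, 100, 100, 0.96))                  # ~0.096
print(exponential_decay(0.1, 150, 100, 0.96, staircase=True))  # still ~0.096
```

The star-import makes `piecewise_constant` (and any other public decay schedule in the module) available alongside this one.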