path: root/tensorflow/python/training/adadelta.py
Commit message (Author, Date)
* Allow adadelta, adagrad, adam, rmsprop, and gradient_descent optimizers take ... (A. Unique TensorFlower, 2018-06-11)
* Adding tf_export decorators/calls to TensorFlow functions and constants. (Anna R, 2018-01-31)
* Document learning rate parameter with regards to the original paper. (Andrew Selle, 2017-03-28)
* Remove @@__init__ from docstrings. (Andrew Selle, 2017-02-13)
* Backprop through resource handles. (Alexandre Passos, 2017-02-13)
* Enables sparse optimizers for resource variables. (A. Unique TensorFlower, 2017-01-31)
* Enables all optimizers for dense resource variables. (A. Unique TensorFlower, 2017-01-27)
* Automated rollback of change 139400135 (Jonathan Hseu, 2016-11-18)
* Rename `Tensor` to `Output` in all Python docs (Jonathan Hseu, 2016-11-16)
* Execute TODOs to (Olivia Nordquist, 2016-06-14)
* Update copyright for 3p/tf/python. (A. Unique TensorFlower, 2016-06-02)
* Fix link to adadelta paper in adadelta optimizer docs. (Dan Mané, 2016-05-23)
* Merge changes from github. (A. Unique TensorFlower, 2016-05-05)
* Enable fp16 support for all optimizers, and also add unit tests for all that (A. Unique TensorFlower, 2016-04-20)
* Merge changes from github, some fixes to adhere somewhat (Vijay Vasudevan, 2016-03-22)
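For context, a minimal sketch of how the optimizer defined in adadelta.py is used through the TF 1.x graph-mode API. The variable and loss below are illustrative toys, not taken from this file; the constructor defaults (rho=0.95, small epsilon) follow Zeiler's Adadelta paper referenced in the docs commits above.

```python
import tensorflow as tf

# Toy variable and quadratic loss, purely for illustration.
w = tf.Variable([1.0, 2.0], name="w")
loss = tf.reduce_sum(tf.square(w))

# AdadeltaOptimizer lives in tensorflow/python/training/adadelta.py
# and is exposed as tf.train.AdadeltaOptimizer in TF 1.x.
opt = tf.train.AdadeltaOptimizer(learning_rate=1.0, rho=0.95, epsilon=1e-8)
train_op = opt.minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(10):
        sess.run(train_op)  # each run applies one Adadelta update to w
```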