path: root/tensorflow/python/ops/losses
Commit message | Author | Age
* BEGIN_PUBLIC | Alexandre Passos | 2018-09-24
* Merge pull request #21798 from facaiy:ENH/div_no_nan_treate_negative_as_zero | TensorFlower Gardener | 2018-09-24
|\
* | Move from deprecated self.test_session() to self.cached_session(). | A. Unique TensorFlower | 2018-09-21
| * CLN: remove unnecessary math_ops.maximum | Yan Facai (颜发才) | 2018-09-12
| * CLN: fix merge error | Yan Facai (颜发才) | 2018-09-12
| * CLN: remove negative_to_zero argument | Yan Facai (颜发才) | 2018-08-23
| * CLN: replace safe_div method by div_no_nan | Yan Facai (颜发才) | 2018-08-22
|/
* Fix typo: compatbility => compatibility | Aurelien Geron | 2018-08-12
* Remove usage of magic-api-link syntax from source files. | Mark Daoust | 2018-08-09
* Update docstring of sparse_softmax_cross_entropy (#20242) | Yong Tang | 2018-06-27
* Don't add to the global losses collection from tf.losses.* when executing eag... | Allen Lavoie | 2018-06-18
* Improves documentation of labels and logits arguments in hinge loss methods. | Petros Mol | 2018-05-21
* Removing @@ comments from core TensorFlow. They are no longer needed for expo... | Anna R | 2018-04-26
* Removing remove_undocumented calls from tensorflow/python. | Anna R | 2018-04-25
* Merge changes from github. | Yifei Feng | 2018-04-23
* Remove all_opensource_files. It's not needed any more. | Martin Wicke | 2018-03-28
* Save the last loss reduction method (for future use). | A. Unique TensorFlower | 2018-03-26
* Merge changes from github. | Shanqing Cai | 2018-03-12
* eager: Rename in_eager_mode to executing_eagerly and get rid of in_graph_mode. | Asim Shankar | 2018-03-07
* Benchmark regression | Alexandre Passos | 2018-03-05
* Merge changes from github. | Yifei Feng | 2018-02-22
* Fast-path for losses code. | Alexandre Passos | 2018-02-21
* Merge changes from github. | Ankur Taly | 2018-02-16
* Use x*x instead of x^2 to calculate square in huber loss implementation. The ... | Yuefeng Zhou | 2018-02-09
* Merge changes from github. | Michael Case | 2018-02-07
* Fixes a type conversion bug in losses.compute_weighted_loss for reduction=SUM... | A. Unique TensorFlower | 2018-02-01
* Adding tf_export decorators/calls to TensorFlow functions and constants. | Anna R | 2018-01-31
* Merge changes from github. | Jianwei Xie | 2018-01-24
* Adds SUM_OVER_BATCH_SIZE in losses.Reduction. | A. Unique TensorFlower | 2018-01-18
* Merge changes from github. | Patrick Nguyen | 2017-12-28
* Merge changes from github. | Shanqing Cai | 2017-12-06
* Merge changes from github. | Benoit Steiner | 2017-10-24
* Several minor documentation fixes. | A. Unique TensorFlower | 2017-10-04
* Discard some unnecessary logging commands. | A. Unique TensorFlower | 2017-09-12
* Adds assertions to loss functions to clarify error messages. | A. Unique TensorFlower | 2017-08-25
* Check that weights are part of the correct graph in `hinge_loss`. | A. Unique TensorFlower | 2017-08-14
* BUILD dependency cleanups. | Peter Hawkins | 2017-08-07
* BUILD cleanup | A. Unique TensorFlower | 2017-08-05
* Use more efficient squared_difference | Sergio Guadarrama | 2017-06-21
* Fix documentation on weights of tf.losses.softmax_cross_entropy. | A. Unique TensorFlower | 2017-05-26
* Fix `python/ops/losses/util.py` docstrings. | A. Unique TensorFlower | 2017-05-23
* Checks the ndims of weights before indexing in the sparse_softmax_cross_entropy | Jianwei Xie | 2017-05-16
* Fix losses documentation. | Andrew Selle | 2017-05-02
* Fix losses.get_regularization_losses documentation to avoid implying losses a... | A. Unique TensorFlower | 2017-04-28
* Add unreduced NONE, and reduced MEAN options for losses. | A. Unique TensorFlower | 2017-04-20
* Add Huber Loss to tf.losses | Sergio Guadarrama | 2017-04-17
* Add option to return loss as batch sum, or divided by number of non-zero weig... | A. Unique TensorFlower | 2017-04-10
* Fix the open source test. | Jianwei Xie | 2017-02-24
* Return `0.0` if `get_regularization_loss` is called with no losses defined. | A. Unique TensorFlower | 2017-02-23
* Add convenience function to get total regularization loss. | A. Unique TensorFlower | 2017-02-21
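For reference, a minimal usage sketch of the tf.losses API that several commits in this log touch: Huber loss (added 2017-04-17), the Reduction.SUM_OVER_BATCH_SIZE option (added 2018-01-18), and get_regularization_loss (added 2017-02-21). This is illustrative only and assumes the TensorFlow 1.x graph-mode API of that period, not code from this directory.

# Illustrative sketch, assuming TensorFlow 1.x graph mode.
import tensorflow as tf

labels = tf.constant([[0.0], [1.0], [2.0]])
predictions = tf.constant([[0.5], [1.0], [3.0]])

# Huber loss, averaged over the batch via the SUM_OVER_BATCH_SIZE reduction.
loss = tf.losses.huber_loss(
    labels=labels,
    predictions=predictions,
    delta=1.0,
    reduction=tf.losses.Reduction.SUM_OVER_BATCH_SIZE)

# Convenience total of regularization losses; returns 0.0 if none are defined.
reg_loss = tf.losses.get_regularization_loss()

with tf.Session() as sess:
    print(sess.run([loss, reg_loss]))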