author: Allen Lavoie <allenl@google.com> (2018-09-28 09:27:29 -0700)
committer: TensorFlower Gardener <gardener@tensorflow.org> (2018-09-28 09:33:20 -0700)
commit: 4eb53d3e5f7bec3c757a06d186ff31fe52083e6d (patch)
tree: b3844674c71f21e7a79ec014df9e395a80507400 /tensorflow/go
parent: f4014108a310928cd897085a8bc7d757c641a1c3 (diff)
Simplify eager/graph Layer.losses conditionals
Fixes an issue where losses created while executing eagerly were returned as unevaluated lambdas in a defun.
Lazily evaluates Layer losses by default when possible. Even when graph building, this is generally preferable (e.g. for losses created inside a while_loop).
Allows calls to Layer.add_loss when executing eagerly, but only for losses which are not conditional on inputs (no activity regularizers).
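The deferred-evaluation behavior described above can be sketched in plain Python. This is a toy model of the pattern, not TensorFlow's actual Layer implementation; the class, attribute, and error-message names here are illustrative:

```python
# Toy sketch of the lazy-loss pattern described in the commit message.
# Unconditional losses are stored as callables and only evaluated when
# .losses is read; input-conditional losses are rejected in eager mode.
class Layer:
    def __init__(self):
        self._callable_losses = []  # deferred, unconditional losses

    def add_loss(self, loss):
        if callable(loss):
            # Unconditional loss: store the callable and defer evaluation
            # until .losses is read. Deferral is safe both when executing
            # eagerly and when graph building (e.g. inside a while_loop).
            self._callable_losses.append(loss)
        else:
            # Losses conditional on inputs (activity regularizers) cannot
            # be deferred this way, so they are disallowed eagerly.
            raise RuntimeError(
                "Conditional losses are not supported when executing "
                "eagerly.")

    @property
    def losses(self):
        # Evaluate the deferred callables so callers always receive
        # concrete loss values, never unevaluated lambdas.
        return [fn() for fn in self._callable_losses]


weight = 3.0
layer = Layer()
# A weight-only (unconditional) loss, added as a callable.
layer.add_loss(lambda: 0.01 * weight ** 2)
print(layer.losses)  # concrete values, not lambdas
```

Reading `layer.losses` re-invokes each callable, so the returned values track the current weight values rather than the values at the time `add_loss` was called.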
PiperOrigin-RevId: 214947108
Diffstat (limited to 'tensorflow/go')
0 files changed, 0 insertions, 0 deletions