author    A. Unique TensorFlower <gardener@tensorflow.org>  2017-09-28 11:53:24 -0700
committer TensorFlower Gardener <gardener@tensorflow.org>  2017-09-28 11:59:04 -0700
commit    0254d0d31337724db911c89609336afd60e8192d (patch)
tree      fb167b3647a9b2030173387831b94eece9a59fdd /tensorflow/contrib/nn/__init__.py
parent    996a85d436a0f45d5bfdaad2946cef12f70883eb (diff)
Adds tf.contrib.nn.scaled_softplus(x, alpha) = alpha * softplus(x/alpha). This can be thought of as a smoothed version of ReLU. On ImageNet, alpha=0.3 gives a 0.6-1% improvement in validation accuracy over ReLU by reducing the generalization gap.
PiperOrigin-RevId: 170376244
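
For readers unfamiliar with the op, a minimal sketch of the formula from the commit message could look like the following in TF 1.x terms. This is an illustration only, not the checked-in implementation in scaled_softplus.py, which may differ (e.g. in gradient handling or argument validation):

    import tensorflow as tf

    def scaled_softplus(x, alpha):
      # Sketch only: alpha * softplus(x / alpha), per the commit message.
      # As alpha -> 0+, this approaches relu(x); larger alpha smooths the kink.
      alpha = tf.convert_to_tensor(alpha, dtype=x.dtype)
      return alpha * tf.nn.softplus(x / alpha)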
Diffstat (limited to 'tensorflow/contrib/nn/__init__.py')
-rw-r--r--  tensorflow/contrib/nn/__init__.py | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/tensorflow/contrib/nn/__init__.py b/tensorflow/contrib/nn/__init__.py
index 2cfeaa955d..be0957f473 100644
--- a/tensorflow/contrib/nn/__init__.py
+++ b/tensorflow/contrib/nn/__init__.py
@@ -26,9 +26,10 @@ from __future__ import division
 from __future__ import print_function
 
 # pylint: disable=unused-import,wildcard-import
+from tensorflow.contrib.nn.python.ops.alpha_dropout import *
 from tensorflow.contrib.nn.python.ops.cross_entropy import *
 from tensorflow.contrib.nn.python.ops.sampling_ops import *
-from tensorflow.contrib.nn.python.ops.alpha_dropout import *
+from tensorflow.contrib.nn.python.ops.scaled_softplus import *
 # pylint: enable=unused-import,wildcard-import
 
 from tensorflow.python.util.all_util import remove_undocumented
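
With the wildcard import above in place, the op should be reachable from the contrib namespace (assuming scaled_softplus.py exports the symbol), e.g.:

    y = tf.contrib.nn.scaled_softplus(x, alpha=0.3)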