Diffstat (limited to 'tensorflow/docs_src/api_guides/python/nn.md')
 tensorflow/docs_src/api_guides/python/nn.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/tensorflow/docs_src/api_guides/python/nn.md b/tensorflow/docs_src/api_guides/python/nn.md
index 4f188372a0..75dbb04e7d 100644
--- a/tensorflow/docs_src/api_guides/python/nn.md
+++ b/tensorflow/docs_src/api_guides/python/nn.md
@@ -8,7 +8,7 @@ Note: Functions taking `Tensor` arguments can also take anything accepted by
 ## Activation Functions
 
 The activation ops provide different types of nonlinearities for use in neural
-networks. These include smooth nonlinearities (`sigmoid`, `tanh`, `elu`,
+networks. These include smooth nonlinearities (`sigmoid`, `tanh`, `elu`, `selu`,
 `softplus`, and `softsign`), continuous but not everywhere differentiable
 functions (`relu`, `relu6`, `crelu` and `relu_x`), and random regularization
 (`dropout`).
@@ -20,6 +20,7 @@ shape as the input tensor.
 * @{tf.nn.relu6}
 * @{tf.nn.crelu}
 * @{tf.nn.elu}
+* @{tf.nn.selu}
 * @{tf.nn.softplus}
 * @{tf.nn.softsign}
 * @{tf.nn.dropout}
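For context, the `selu` op this commit documents is the scaled exponential linear unit from Klambauer et al.'s self-normalizing networks paper: `scale * x` for positive inputs and `scale * alpha * (exp(x) - 1)` otherwise, with fixed constants. A minimal NumPy sketch of the math (not the TensorFlow implementation itself):

```python
import numpy as np

# Fixed constants from "Self-Normalizing Neural Networks" (Klambauer et al., 2017)
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x):
    """Scaled ELU: SCALE * x for x > 0, SCALE * ALPHA * (exp(x) - 1) otherwise."""
    x = np.asarray(x, dtype=np.float64)
    # np.expm1 computes exp(x) - 1 with better precision near zero
    return SCALE * np.where(x > 0, x, ALPHA * np.expm1(x))

print(selu(np.array([-1.0, 0.0, 1.0])))
```

With these constants, activations of a deep network with appropriately initialized weights converge toward zero mean and unit variance, which is why `selu` is grouped with the smooth nonlinearities in the docs change above.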