path: root/tensorflow/contrib/nn
author    Mark Daoust <markdaoust@google.com>  2018-08-09 07:03:39 -0700
committer TensorFlower Gardener <gardener@tensorflow.org>  2018-08-09 07:08:30 -0700
commit    f40a875355557483aeae60ffcf757fc9626c752b (patch)
tree      7f642a6fd12495c1c7d9b2f3a37e376d8ee6d2c9 /tensorflow/contrib/nn
parent    fd9fc4b4b69f7fce60497bbaf5cbd958f12ead8d (diff)
Remove usage of magic-api-link syntax from source files.
Back-ticks are now converted to links in the api_docs generator. With the new docs repo we're moving to simplify the docs pipeline and make everything more readable.

By doing this we no longer get test failures for symbols that don't exist (`tf.does_not_exist` will not get a link). There is also no longer a way to set custom link text; that's okay.

This is the result of the following regex replacement (plus a couple of manual edits):

re:  @\{([^$].*?)(\$.+?)?}
sub: `\1`

Which does the following replacements:

"@{tf.symbol}"           --> "`tf.symbol`"
"@{tf.symbol$link_text}" --> "`tf.symbol`"

PiperOrigin-RevId: 208042358
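For reference, a minimal sketch of the substitution described in the commit message, using Python's `re` module. The actual change was produced by the commit author's tooling plus a few manual edits; the function and variable names below are illustrative only, and the pattern is taken verbatim from the message above.

```python
import re

# Pattern quoted in the commit message: matches "@{tf.symbol}" and
# "@{tf.symbol$link_text}", capturing the symbol in group 1.
API_LINK = re.compile(r"@\{([^$].*?)(\$.+?)?}")

def strip_api_links(text):
    """Rewrite "@{tf.symbol}" / "@{tf.symbol$link_text}" to "`tf.symbol`"."""
    return API_LINK.sub(r"`\1`", text)

print(strip_api_links("See @{tf.set_random_seed} for behavior."))
# -> See `tf.set_random_seed` for behavior.
print(strip_api_links("See @{tf.nn.sampled_softmax_loss$sampled softmax}."))
# -> See `tf.nn.sampled_softmax_loss`.
```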
Diffstat (limited to 'tensorflow/contrib/nn')
-rw-r--r--  tensorflow/contrib/nn/python/ops/alpha_dropout.py |  2
-rw-r--r--  tensorflow/contrib/nn/python/ops/sampling_ops.py  | 10
2 files changed, 6 insertions, 6 deletions
diff --git a/tensorflow/contrib/nn/python/ops/alpha_dropout.py b/tensorflow/contrib/nn/python/ops/alpha_dropout.py
index 2f92d05ba8..98f4264fe0 100644
--- a/tensorflow/contrib/nn/python/ops/alpha_dropout.py
+++ b/tensorflow/contrib/nn/python/ops/alpha_dropout.py
@@ -43,7 +43,7 @@ def alpha_dropout(x, keep_prob, noise_shape=None, seed=None, name=None): # pylin
noise_shape: A 1-D `Tensor` of type `int32`, representing the
shape for randomly generated keep/drop flags.
seed: A Python integer. Used to create random seeds. See
- @{tf.set_random_seed} for behavior.
+ `tf.set_random_seed` for behavior.
name: A name for this operation (optional).
Returns:
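The hunk above only touches the argument docs; for context, here is a minimal usage sketch of the function whose signature appears in the hunk header, assuming the TF 1.x `tf.contrib.nn` export of `alpha_dropout`. Shapes and values are illustrative.

```python
import tensorflow as tf  # TF 1.x

x = tf.random_normal([128, 256])
# Keep ~90% of activations; alpha dropout is designed to preserve the
# self-normalizing mean/variance properties used with SELU activations.
y = tf.contrib.nn.alpha_dropout(x, keep_prob=0.9, seed=42)

with tf.Session() as sess:
    print(sess.run(y).shape)  # (128, 256)
```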
diff --git a/tensorflow/contrib/nn/python/ops/sampling_ops.py b/tensorflow/contrib/nn/python/ops/sampling_ops.py
index e65925610c..de71b0845e 100644
--- a/tensorflow/contrib/nn/python/ops/sampling_ops.py
+++ b/tensorflow/contrib/nn/python/ops/sampling_ops.py
@@ -123,15 +123,15 @@ def rank_sampled_softmax_loss(weights,
"""Computes softmax loss using rank-based adaptive resampling.
This has been shown to improve rank loss after training compared to
- @{tf.nn.sampled_softmax_loss}. For a description of the algorithm and some
+ `tf.nn.sampled_softmax_loss`. For a description of the algorithm and some
experimental results, please see: [TAPAS: Two-pass Approximate Adaptive
Sampling for Softmax](https://arxiv.org/abs/1707.03073).
Sampling follows two phases:
* In the first phase, `num_sampled` classes are selected using
- @{tf.nn.learned_unigram_candidate_sampler} or supplied `sampled_values`.
+ `tf.nn.learned_unigram_candidate_sampler` or supplied `sampled_values`.
The logits are calculated on those sampled classes. This phase is
- similar to @{tf.nn.sampled_softmax_loss}.
+ similar to `tf.nn.sampled_softmax_loss`.
* In the second phase, the `num_resampled` classes with highest predicted
probability are kept. Probabilities are
`LogSumExp(logits / resampling_temperature)`, where the sum is over
@@ -142,7 +142,7 @@ def rank_sampled_softmax_loss(weights,
picks more candidates close to the predicted classes. A common strategy is
to decrease the temperature as training proceeds.
- See @{tf.nn.sampled_softmax_loss} for more documentation on sampling and
+ See `tf.nn.sampled_softmax_loss` for more documentation on sampling and
for typical default values for some of the parameters.
This operation is for training only. It is generally an underestimate of
@@ -197,7 +197,7 @@ def rank_sampled_softmax_loss(weights,
where a sampled class equals one of the target classes.
partition_strategy: A string specifying the partitioning strategy, relevant
if `len(weights) > 1`. Currently `"div"` and `"mod"` are supported.
- See @{tf.nn.embedding_lookup} for more details.
+ See `tf.nn.embedding_lookup` for more details.
name: A name for the operation (optional).
Returns:
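To make the second phase described in the docstring above concrete, here is a minimal NumPy sketch of the stated `LogSumExp(logits / resampling_temperature)` scoring, where the sum runs over the batch and the `num_resampled` highest-scoring sampled classes are kept. This is an illustration, not the library's implementation; the function and variable names are assumptions.

```python
import numpy as np

def resample_top_classes(sampled_logits, resampling_temperature, num_resampled):
    """Keep the sampled classes with the highest LogSumExp(logits / temperature),
    where the LogSumExp is taken over the batch dimension."""
    scaled = sampled_logits / resampling_temperature       # [batch, num_sampled]
    m = scaled.max(axis=0)                                  # stabilize the exp
    scores = m + np.log(np.exp(scaled - m).sum(axis=0))     # LogSumExp over batch
    # Indices of the num_resampled highest-scoring sampled classes.
    return np.argsort(-scores)[:num_resampled]

# Example: 4 examples, 8 sampled classes, keep the top 3.
logits = np.random.randn(4, 8)
print(resample_top_classes(logits, resampling_temperature=0.5, num_resampled=3))
```

A lower `resampling_temperature` sharpens the scores, so the selection concentrates on classes the model already predicts strongly, matching the note in the docstring that a common strategy is to decrease the temperature as training proceeds.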