| Commit message | Author | Age |
|
|
|
| |
PiperOrigin-RevId: 213327709
|
|
|
|
|
|
| |
https://arxiv.org/pdf/1609.08144.pdf).
PiperOrigin-RevId: 212683753
|
|
|
|
|
|
|
|
| |
self.test_session() has been deprecated in 9962eb5e84b15e309410071b06c2ed2d6148ed44 as its name confuses readers of the test. Moving to cached_session() instead, which is more explicit about:
* the fact that the session may be reused.
* the fact that the session is not closed even when using a "with self.test_session()" statement.
PiperOrigin-RevId: 209701635
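A minimal sketch of the new pattern, assuming a TF 1.x test built on `tf.test.TestCase` (the test class and values below are illustrative, not taken from the change itself):
```
import tensorflow as tf

class ReuseSessionTest(tf.test.TestCase):  # hypothetical test class

  def testAdd(self):
    # cached_session() makes the reuse explicit: the same session may be
    # returned across calls, and it is not closed when the block exits.
    with self.cached_session() as sess:
      self.assertEqual(sess.run(tf.add(2, 3)), 5)

if __name__ == "__main__":
  tf.test.main()
```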
|
|
|
|
|
|
|
|
|
|
| |
This change contains no code changes. Only doc-strings.
We can't use relative links in code files, so we don't have much choice but to link to tensorflow.org/
The deleted links were to docs that no longer exist.
PiperOrigin-RevId: 209019572
|
|
|
|
| |
PiperOrigin-RevId: 208293192
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Back-ticks are now converted to links in the api_docs generator. With the new docs repo we're moving to simplify the docs pipeline, and make everything more readable.
By doing this we no longer get test failures for symbols that don't exist (`tf.does_not_exist` will not get a link).
There is also no longer any way to set custom link text. That's okay.
This is the result of the following regex replacement (+ a couple of manual edits.):
re: @\{([^$].*?)(\$.+?)?}
sub: `\1`
Which does the following replacements:
"@{tf.symbol}" --> "`tf.symbol`"
"@{tf.symbol$link_text}" --> "`tf.symbol`"
PiperOrigin-RevId: 208042358
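As a rough illustration, the replacement can be reproduced with Python's `re` module; the sample doc-string below is made up, while the pattern and substitution are the ones quoted above:
```
import re

text = "See @{tf.nn.softmax} and @{tf.matmul$matrix multiplication}."
converted = re.sub(r"@\{([^$].*?)(\$.+?)?}", r"`\1`", text)
print(converted)  # See `tf.nn.softmax` and `tf.matmul`.
```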
|
|
|
|
|
|
|
|
|
|
|
|
| |
The `sequence_length` argument that is passed to the function contains the
lengths of the **reordered** predictions and was incorrectly used to
mask beam ids *before* reordering. Instead, we can reorder the beam ids
without caring about out-of-range steps and only select the reordered
ids that are in bounds.
The added test covers a beam trajectory that previously produced an
out-of-range error because `gather_tree` returned `end_token` (here
`beam_width + 1`) for some steps.
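A toy NumPy illustration of the ordering issue (all values are hypothetical): because the lengths describe the *reordered* beams, masking has to happen after reordering.
```
import numpy as np

max_time, beam_width, end_token = 3, 2, 1
reordered_ids = np.array([[4, 7],
                          [5, 8],
                          [3, 9]])          # [max_time, beam_width], already reordered
sequence_length = np.array([2, 3])          # lengths of the *reordered* beams

# Keep only in-bounds steps of the reordered ids; pad the rest with end_token.
step = np.arange(max_time)[:, None]
masked = np.where(step < sequence_length[None, :], reordered_ids, end_token)
print(masked)  # beam 0 is padded with end_token after its 2 valid steps
```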
|
|\
| |
| | |
Update beam_search_decoder.py
|
| |
| |
| |
| |
| |
| | |
could be updating the weights. This is specifically true on TPUs (tpu.repeat). Also, fix the `testDynamicRnnTrainLoop` unit test.
PiperOrigin-RevId: 202565323
|
| | |
|
| |
| |
| |
| | |
PiperOrigin-RevId: 194031845
|
| |\
| |/
|/| |
|
| |
| |
| |
| | |
PiperOrigin-RevId: 193448139
|
| |\
| |/
|/| |
|
| |
| |
| |
| | |
PiperOrigin-RevId: 193224285
|
| |
| |
| |
| |
| |
| | |
the user supplies it and if the inputs were created in an XLA context.
PiperOrigin-RevId: 193097293
|
| |
| |
| |
| | |
Signed-off-by: Yong Tang <yong.tang.github@outlook.com>
|
| |
| |
| |
| | |
Signed-off-by: Yong Tang <yong.tang.github@outlook.com>
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
float16/32
While revisiting 18016 I noticed that Bahdanau attention has a similar
dtype mismatch issue when normalized=True. The issue comes from:
```
g = variable_scope.get_variable(
"attention_g", dtype=dtype,
initializer=math.sqrt((1. / num_units)))
```
where the initializer value does not work well with different dtypes.
This fix changes the initializer to `init_ops.constant_initializer`
to address the issue, and adds additional test cases for it.
Signed-off-by: Yong Tang <yong.tang.github@outlook.com>
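A minimal sketch of the corrected initialization, written against the public TF 1.x API rather than the internal `variable_scope`/`init_ops` aliases; `num_units` and `dtype` are placeholder values:
```
import math
import tensorflow as tf

num_units, dtype = 128, tf.float16  # illustrative values
# A constant_initializer with an explicit dtype works for float16/32/64 alike.
g = tf.get_variable(
    "attention_g", dtype=dtype, shape=(),
    initializer=tf.constant_initializer(math.sqrt(1. / num_units), dtype=dtype))
```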
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
* Support passing Layer instances to the AttentionWrapper.
* Use _compute_output_shape to get the attention layer depth
* compute_output_shape is now a public method
* Move the new argument to the end
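A hedged sketch of the new usage, assuming the added argument is named `attention_layer` (as in later `tf.contrib.seq2seq` releases); shapes and layer sizes are illustrative:
```
import tensorflow as tf

memory = tf.placeholder(tf.float32, [None, 20, 64])
mechanism = tf.contrib.seq2seq.LuongAttention(num_units=64, memory=memory)
cell = tf.nn.rnn_cell.LSTMCell(64)

# Pass a Layer instance instead of an integer attention_layer_size.
attention_layer = tf.layers.Dense(32, use_bias=False, activation=tf.tanh)
wrapped_cell = tf.contrib.seq2seq.AttentionWrapper(
    cell, mechanism, attention_layer=attention_layer)
```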
|
| |
| |
| |
| | |
PiperOrigin-RevId: 192850372
|
| |
| |
| |
| | |
PiperOrigin-RevId: 192388250
|
| |
| |
| |
| |
| |
| | |
* Fix some rendering format in contrib doc strings
* Fix line too long pylint error
|
|/
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
tf.float16/tf.float64 (#18106)
* Fix issue with Luong attention when scale=True and dtype=tf.float16/tf.float64
This fix addresses the issue raised in 18099, where Luong attention
throws a ValueError when scale=True and the dtype is not tf.float32,
and adds a test case for it.
This fix fixes 18099.
Signed-off-by: Yong Tang <yong.tang.github@outlook.com>
* Fix pylint issue
Signed-off-by: Yong Tang <yong.tang.github@outlook.com>
* Add test case for Luong attention with scale=True and dtype=float16/float64
Signed-off-by: Yong Tang <yong.tang.github@outlook.com>
* Add assertEqual to confirm the dtypes of the output
Signed-off-by: Yong Tang <yong.tang.github@outlook.com>
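A minimal sketch of the case the new tests exercise (shapes are illustrative): Luong attention with `scale=True` on a non-float32 dtype.
```
import tensorflow as tf

memory = tf.placeholder(tf.float16, [None, 20, 64])
attention = tf.contrib.seq2seq.LuongAttention(
    num_units=64, memory=memory, scale=True, dtype=tf.float16)
```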
|
|
|
|
| |
PiperOrigin-RevId: 190953197
|
|
|
|
| |
PiperOrigin-RevId: 190878279
|
|
|
|
| |
PiperOrigin-RevId: 190858242
|
|
|
|
| |
PiperOrigin-RevId: 190835392
|
|
|
|
| |
PiperOrigin-RevId: 190537320
|
|
|
|
|
|
|
|
| |
extended BasicDecoder which, for example, returns a tf.contrib.seq2seq.AttentionWrapperState.
In this case the internal while-loop fails when trying to store an instance of tf.contrib.seq2seq.AttentionWrapperState in the internal TensorArray.
PiperOrigin-RevId: 190491787
|
|
|
|
| |
PiperOrigin-RevId: 189945839
|
|
|
|
| |
PiperOrigin-RevId: 189258641
|
|
|
|
| |
PiperOrigin-RevId: 189231636
|
|
|
|
| |
PiperOrigin-RevId: 188817194
|
|
|
|
| |
PiperOrigin-RevId: 188698275
|
|
|
|
| |
PiperOrigin-RevId: 188651070
|
|
|
|
| |
PiperOrigin-RevId: 186674197
|
|
|
|
| |
PiperOrigin-RevId: 186073337
|
|
|
|
| |
PiperOrigin-RevId: 184897758
|
|
|
|
| |
PiperOrigin-RevId: 183701716
|
|
|
|
| |
PiperOrigin-RevId: 183438398
|
|
|
|
| |
PiperOrigin-RevId: 183321394
|
|
|
|
|
|
| |
They don't make sense in the open source repository.
PiperOrigin-RevId: 183140889
|
|
|
|
| |
PiperOrigin-RevId: 183115307
|
|
|
|
| |
PiperOrigin-RevId: 183100142
|
|
|
|
|
|
| |
Fixes #15737
PiperOrigin-RevId: 181523430
|
|
|
|
| |
PiperOrigin-RevId: 180981378
|
|
|
|
|
|
|
|
|
| |
Motivations:
- Useful for computing the shape of a layer's output without calling the layer.
- It is public in standalone Keras (hence an API discrepancy, which is something to be avoided).
- With eager mode and deferred mode for Network building, it is going to be increasingly necessary for users to implement this method or call it.
- Lots of internal users are apparently already relying on it, which highlights the importance of making this feature publicly available.
PiperOrigin-RevId: 180854139
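A minimal sketch of the now-public method on a custom layer (the layer itself is hypothetical):
```
import tensorflow as tf

class Concatenate2x(tf.keras.layers.Layer):  # hypothetical example layer

  def call(self, inputs):
    return tf.concat([inputs, inputs], axis=-1)

  def compute_output_shape(self, input_shape):
    shape = tf.TensorShape(input_shape).as_list()
    return tf.TensorShape(shape[:-1] + [shape[-1] * 2])

# Query the output shape without calling the layer on a tensor.
print(Concatenate2x().compute_output_shape((None, 8)))  # (None, 16)
```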
|
|
|
|
| |
PiperOrigin-RevId: 180820115
|
|
|
|
| |
PiperOrigin-RevId: 180301735
|