tensorflow (master): machine learning framework
path: root/tensorflow/contrib/seq2seq/python/ops/attention_wrapper.py
Commit message    Author    Age
*   Remove usage of magic-api-link syntax from source files.   Mark Daoust   2018-08-09
*   Merge commit for internal changes   Yifei Feng   2018-04-17
|\
| *   Fixes a comment in tf.contrib.seq2seq.monotonic_attention().   A. Unique TensorFlower   2018-04-17
* |   Fix pylint issue   Yong Tang   2018-04-16
* |   Fix the issue with Bahdanau attention when normalized=True and dtype = float1...   Yong Tang   2018-04-16
* |   Support passing layer instances to produce attentional hidden states (#14974)   Guillaume Klein   2018-04-15
| *   Merge changes from github.   Scott Zhu   2018-04-13
| *   Merge changes from github.   Michael Case   2018-04-10
* |   Fix some rendering format in contrib doc strings (#18148)   ImSheridan   2018-04-05
* |   Fix issue with Luong attention when scale=True and dtype of tf.float16/tf.flo...   Yong Tang   2018-03-30
| *   Automated g4 rollback of changelist 190858242   Jianwei Xie   2018-03-29
| *   Automated g4 rollback of changelist 190835392   Anna R   2018-03-28
| *   Merge changes from github.   Jianwei Xie   2018-03-28
* |   Seq2seq minorsp (#18010)   brett koonce   2018-03-27
| *   Merge changes from github.   Jacques Pienaar   2018-03-21
* |   Support TensorArray in BeamSearchDecoder state. (#13312)   Guillaume Klein   2018-03-18
|/
*   Improve errors raised when an object does not match the RNNCell interface.   A. Unique TensorFlower   2018-03-11
*   Merge changes from github.   Ankur Taly   2018-02-16
*   Merge changes from github.   Michael Case   2018-02-07
*   Propagate static shape info in AttentionWrapperState.clone() if possible.   Rui Zhao   2018-01-10
*   [tf.contrib.seq2seq] Modify AttentionMechanisms to propagate state.   Eugene Brevdo   2017-12-15
*   Merge changes from github.   Yifei Feng   2017-11-22
*   Automated g4 rollback of changelist 176615107   Yifei Feng   2017-11-22
*   Automated g4 rollback of changelist 176615737   Yifei Feng   2017-11-22
*   Merged commit includes the following changes:   A. Unique TensorFlower   2017-11-22
*   Merge changes from github.   Yifei Feng   2017-11-21
*   Fix tf.contrib.seq2seq._monotonic_probability_fn to use a hard sigmoid when m...   Colin Raffel   2017-11-10
*   Several minor documentation fixes.   A. Unique TensorFlower   2017-10-04
*   [tf.contrib.seq2seq] Better docstrings for AttentionWrapper and BeamSearchDec...   Eugene Brevdo   2017-09-29
*   Merge changes from github.   A. Unique TensorFlower   2017-08-15
*   Merge changes from github.   Benoit Steiner   2017-08-01
*   Add multi-head attention capabilities to AttentionWrapper via the specificati...   Adam Roberts   2017-07-19
*   [tf contrib seq2seq] Provide informative error messages in AttentionWrapper.c...   Eugene Brevdo   2017-07-18
*   Merge changes from github.   Shanqing Cai   2017-07-10
*   [tf contrib seq2seq] Add monotonic attention mechanisms   A. Unique TensorFlower   2017-06-21
*   Merge changes from github.   Jonathan Hseu   2017-06-09
*   Fix attention score padding. Use -inf as default instead of 0 for softmax.   A. Unique TensorFlower   2017-05-26
*   Move many of the "core" RNNCells and rnn functions back to TF core.   Eugene Brevdo   2017-05-22
*   AttentionWrapper: fixed naming   A. Unique TensorFlower   2017-05-19
*   Change rnn ops to check types via duck-typing, instead of a private attribute.   Adria Puigdomenech   2017-05-16
*   [tf contrib seq2seq] Updates to AttentionMechanism API   Eugene Brevdo   2017-05-12
*   Merge changes from github.   Benoit Steiner   2017-05-11
*   Fix a bunch of bad links and missing docs in contrib.   Mark Daoust   2017-05-02
*   Pass attention_wrapper's name to super.   A. Unique TensorFlower   2017-04-27
*   [tf contrib seq2seq] Remove extra "attention" variable scope str in Attention...   Eugene Brevdo   2017-04-27
*   Refactor Keras layers to rely on core TF layers.   Francois Chollet   2017-04-26
*   Check if memory_sequence_length is not None before converting tensor   A. Unique TensorFlower   2017-04-26
*   [tf contrib seq2seq] Update BeamSearchDecoder + AttentionWrapper API:   Eugene Brevdo   2017-04-26
*   Add an option to allow use context vector as attention in Attention Wrapper.   A. Unique TensorFlower   2017-04-24
*   Merge changes from github.   Shanqing Cai   2017-04-22