author:    2016-09-09 16:07:46 -0800
committer: 2016-09-09 17:19:12 -0700
commit:    54a7178849a7603de919e6cdc4f404b470f87d14 (patch)
tree:      b0fe52e6330545f45541504257ab2c9da626be2f /tensorflow/g3doc/tutorials/recurrent/index.md
parent:    f51e1964b57ecce5f3b38d4156cf61e80e5c4181 (diff)
Merge changes from github.
Change: 132733397
Diffstat (limited to 'tensorflow/g3doc/tutorials/recurrent/index.md')
 tensorflow/g3doc/tutorials/recurrent/index.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/tensorflow/g3doc/tutorials/recurrent/index.md b/tensorflow/g3doc/tutorials/recurrent/index.md
index 82b159c20a..3ab8061981 100644
--- a/tensorflow/g3doc/tutorials/recurrent/index.md
+++ b/tensorflow/g3doc/tutorials/recurrent/index.md
@@ -61,7 +61,7 @@ The basic pseudocode looks as follows:
 lstm = rnn_cell.BasicLSTMCell(lstm_size)
 # Initial state of the LSTM memory.
 state = tf.zeros([batch_size, lstm.state_size])
-
+probabilities = []
 loss = 0.0
 for current_batch_of_words in words_in_dataset:
     # The value of state is updated after processing each batch of words.
@@ -69,7 +69,7 @@ for current_batch_of_words in words_in_dataset:
     # The LSTM output can be used to make next word predictions
     logits = tf.matmul(output, softmax_w) + softmax_b
-    probabilities = tf.nn.softmax(logits)
+    probabilities.append(tf.nn.softmax(logits))
     loss += loss_function(probabilities, target_words)
 ```
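The change above fixes a subtle bug in the tutorial pseudocode: assigning `probabilities = tf.nn.softmax(logits)` inside the loop overwrote the result on every iteration, so only the last batch's distribution survived; appending to a list keeps one distribution per step. The pattern can be sketched without TensorFlow. This is a minimal illustration, not the tutorial's actual code: the pure-Python `softmax` stands in for `tf.nn.softmax`, and `logits_per_step` is invented toy data.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a flat list of floats,
    # standing in for tf.nn.softmax (illustrative only).
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Toy stand-in for the per-step logits the LSTM loop would produce.
logits_per_step = [
    [2.0, 1.0, 0.1],
    [0.5, 2.5, 0.2],
]

# Buggy version (pre-fix): the variable is overwritten each step,
# so earlier steps' distributions are lost.
# Fixed version (post-fix): accumulate one distribution per step.
probabilities = []
for logits in logits_per_step:
    probabilities.append(softmax(logits))

# One probability distribution per step, each summing to 1.
assert len(probabilities) == len(logits_per_step)
```

The same overwrite-vs-accumulate distinction matters whenever per-iteration results feed a later aggregate (here, the loss computed from all steps' predictions).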