Diffstat (limited to 'tensorflow/docs_src/get_started/mnist/beginners.md')
 tensorflow/docs_src/get_started/mnist/beginners.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/tensorflow/docs_src/get_started/mnist/beginners.md b/tensorflow/docs_src/get_started/mnist/beginners.md
index 175de2be76..624d916474 100644
--- a/tensorflow/docs_src/get_started/mnist/beginners.md
+++ b/tensorflow/docs_src/get_started/mnist/beginners.md
@@ -367,7 +367,7 @@ train_step = tf.train.GradientDescentOptimizer(0.05).minimize(cross_entropy)
In this case, we ask TensorFlow to minimize `cross_entropy` using the
[gradient descent algorithm](https://en.wikipedia.org/wiki/Gradient_descent)
-with a learning rate of 0.05. Gradient descent is a simple procedure, where
+with a learning rate of 0.5. Gradient descent is a simple procedure, where
TensorFlow simply shifts each variable a little bit in the direction that
reduces the cost. But TensorFlow also provides
@{$python/train#Optimizers$many other optimization algorithms}:
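The paragraph touched by this patch describes gradient descent as repeatedly shifting each variable a little bit in the direction that reduces the cost. A minimal sketch of that procedure, using a toy quadratic cost in place of `cross_entropy` (the function names and the quadratic are illustrative, not from the tutorial; the 0.5 learning rate matches the corrected text):

```python
def minimize(grad, x, learning_rate=0.5, steps=100):
    """Plain gradient descent: step opposite the gradient of the cost."""
    for _ in range(steps):
        x -= learning_rate * grad(x)
    return x

# Toy cost (x - 3)^2 standing in for cross_entropy; its gradient is 2(x - 3).
cost_grad = lambda x: 2.0 * (x - 3.0)
x_min = minimize(cost_grad, x=0.0)  # converges to the minimum at x = 3
```

TensorFlow's `GradientDescentOptimizer` automates exactly this loop over all trainable variables, computing the gradients via backpropagation.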