author    ImSheridan <xiaoyudong-512@163.com>  2018-03-14 06:41:37 +0800
committer Frank Chen <frankchn@gmail.com>      2018-03-13 15:41:37 -0700
commit    dacedd7f028134e61e57cb836060ef712b274209 (patch)
tree      88beb40e2373a33a0b495b3590ba418e624aa860
parent    10d5ea50db844057e445f552ba69cd7bbf9a1f23 (diff)
Fix the messed up list format in using_tpu.md (#17561)
* Fix the list format in using_tpu.md
* Fix the indent before tf.contrib.cluster_resolver.TPUClusterResolver
-rw-r--r-- tensorflow/docs_src/programmers_guide/using_tpu.md | 7
1 file changed, 3 insertions(+), 4 deletions(-)
diff --git a/tensorflow/docs_src/programmers_guide/using_tpu.md b/tensorflow/docs_src/programmers_guide/using_tpu.md
index d74d7f3181..a9c2cb3e33 100644
--- a/tensorflow/docs_src/programmers_guide/using_tpu.md
+++ b/tensorflow/docs_src/programmers_guide/using_tpu.md
@@ -129,10 +129,9 @@ my_tpu_estimator = tf.contrib.tpu.TPUEstimator(
Typically the `FLAGS` would be set by command line arguments. To switch from
training locally to training on a cloud TPU you would need to:
- 1) Set `FLAGS.use_tpu` to `True`
- 1) Set `FLAGS.tpu_name` so the
- `tf.contrib.cluster_resolver.TPUClusterResolver` can find it
- 1) Set `FLAGS.model_dir` to a Google Cloud Storage bucket url (`gs://`).
+* Set `FLAGS.use_tpu` to `True`
+* Set `FLAGS.tpu_name` so the `tf.contrib.cluster_resolver.TPUClusterResolver` can find it
+* Set `FLAGS.model_dir` to a Google Cloud Storage bucket url (`gs://`).
## Optimizer
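
Not part of the patch above: a minimal sketch of how the three `FLAGS` from the rewritten list might be wired into `tf.contrib.tpu.TPUEstimator`. The flag names, `my_tpu_estimator`, and `tf.contrib.cluster_resolver.TPUClusterResolver` come from the doc being patched; the flag defaults, `my_model_fn`, and `train_batch_size` are illustrative assumptions, and the exact constructor arguments of the TF 1.x contrib classes may vary by TensorFlow version.

```python
import tensorflow as tf

# Hypothetical flag definitions; only the flag names come from the doc above.
tf.flags.DEFINE_bool("use_tpu", False, "Train on a Cloud TPU instead of locally.")
tf.flags.DEFINE_string("tpu_name", None, "Name of the Cloud TPU to connect to.")
tf.flags.DEFINE_string("model_dir", "/tmp/model",
                       "Local path, or a gs:// bucket URL when using a TPU.")
FLAGS = tf.flags.FLAGS


def my_model_fn(features, labels, mode, params):
    # Placeholder model_fn; a real one returns a TPUEstimatorSpec.
    raise NotImplementedError


# Only resolve a TPU when FLAGS.use_tpu is set; locally, no master is needed.
if FLAGS.use_tpu:
    resolver = tf.contrib.cluster_resolver.TPUClusterResolver(tpu=FLAGS.tpu_name)
    master = resolver.get_master()
else:
    master = ""

run_config = tf.contrib.tpu.RunConfig(
    master=master,
    model_dir=FLAGS.model_dir,  # must be a gs:// URL when training on a TPU
    tpu_config=tf.contrib.tpu.TPUConfig())

my_tpu_estimator = tf.contrib.tpu.TPUEstimator(
    model_fn=my_model_fn,
    config=run_config,
    use_tpu=FLAGS.use_tpu,
    train_batch_size=1024)  # assumed value; must divide evenly across TPU shards
```

With this shape, the switch described in the doc is purely a matter of command-line arguments: pass `--use_tpu --tpu_name=<name> --model_dir=gs://<bucket>/...` to run on a Cloud TPU, and omit them to train locally.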