path: root/tensorflow/docs_src/programmers_guide/using_tpu.md
Diffstat (limited to 'tensorflow/docs_src/programmers_guide/using_tpu.md')
-rw-r--r--  tensorflow/docs_src/programmers_guide/using_tpu.md  7
1 file changed, 4 insertions(+), 3 deletions(-)
diff --git a/tensorflow/docs_src/programmers_guide/using_tpu.md b/tensorflow/docs_src/programmers_guide/using_tpu.md
index a9c2cb3e33..d74d7f3181 100644
--- a/tensorflow/docs_src/programmers_guide/using_tpu.md
+++ b/tensorflow/docs_src/programmers_guide/using_tpu.md
@@ -129,9 +129,10 @@ my_tpu_estimator = tf.contrib.tpu.TPUEstimator(
Typically the `FLAGS` would be set by command line arguments. To switch from
training locally to training on a cloud TPU you would need to:
-* Set `FLAGS.use_tpu` to `True`
-* Set `FLAGS.tpu_name` so the `tf.contrib.cluster_resolver.TPUClusterResolver` can find it
-* Set `FLAGS.model_dir` to a Google Cloud Storage bucket url (`gs://`).
+ 1) Set `FLAGS.use_tpu` to `True`
+ 1) Set `FLAGS.tpu_name` so the
+ `tf.contrib.cluster_resolver.TPUClusterResolver` can find it
+ 1) Set `FLAGS.model_dir` to a Google Cloud Storage bucket url (`gs://`).
## Optimizer
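
The three-flag switch described in the changed hunk can be sketched as below. This is a minimal illustration only: it uses `argparse` in place of TensorFlow's flag machinery, the `build_run_config` helper is hypothetical, and the actual `TPUClusterResolver`/`TPUEstimator` calls from the doc are shown as comments rather than executed.

```python
import argparse

def build_run_config(argv=None):
    # Mirror the three FLAGS named in the doc: use_tpu, tpu_name, model_dir.
    parser = argparse.ArgumentParser()
    parser.add_argument("--use_tpu", action="store_true")
    parser.add_argument("--tpu_name", default=None)
    parser.add_argument("--model_dir", default="/tmp/model")
    flags = parser.parse_args(argv)

    if flags.use_tpu:
        # On a cloud TPU the model directory must be a Google Cloud Storage
        # bucket (gs://), since TPU workers cannot read a local filesystem.
        if not flags.model_dir.startswith("gs://"):
            raise ValueError("TPU training requires a gs:// model_dir")
        # In real code (per the doc's tf.contrib APIs):
        #   resolver = tf.contrib.cluster_resolver.TPUClusterResolver(flags.tpu_name)
        #   ...construct tf.contrib.tpu.TPUEstimator with that resolver...
        return {"mode": "tpu", "tpu": flags.tpu_name, "model_dir": flags.model_dir}

    # Local training path: a plain Estimator with a local model_dir suffices.
    return {"mode": "local", "model_dir": flags.model_dir}
```

Running with `--use_tpu --tpu_name my-tpu --model_dir gs://my-bucket/run1` selects the TPU path; with no arguments the same binary trains locally, which is the switch the list items above describe.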