author | ImSheridan <xiaoyudong-512@163.com> | 2018-03-14 06:41:37 +0800
---|---|---
committer | Frank Chen <frankchn@gmail.com> | 2018-03-13 15:41:37 -0700
commit | dacedd7f028134e61e57cb836060ef712b274209 (patch) |
tree | 88beb40e2373a33a0b495b3590ba418e624aa860 |
parent | 10d5ea50db844057e445f552ba69cd7bbf9a1f23 (diff) |
Fix the messed up list format in using_tpu.md (#17561)
* Fix the list format in using_tpu.md
* Fix the indent before tf.contrib.cluster_resolver.TPUClusterResolver
-rw-r--r-- | tensorflow/docs_src/programmers_guide/using_tpu.md | 7
1 file changed, 3 insertions(+), 4 deletions(-)
```diff
diff --git a/tensorflow/docs_src/programmers_guide/using_tpu.md b/tensorflow/docs_src/programmers_guide/using_tpu.md
index d74d7f3181..a9c2cb3e33 100644
--- a/tensorflow/docs_src/programmers_guide/using_tpu.md
+++ b/tensorflow/docs_src/programmers_guide/using_tpu.md
@@ -129,10 +129,9 @@ my_tpu_estimator = tf.contrib.tpu.TPUEstimator(
 
 Typically the `FLAGS` would be set by command line arguments. To switch from
 training locally to training on a cloud TPU you would need to:
 
-  1) Set `FLAGS.use_tpu` to `True`
-  1) Set `FLAGS.tpu_name` so the
-     `tf.contrib.cluster_resolver.TPUClusterResolver` can find it
-  1) Set `FLAGS.model_dir` to a Google Cloud Storage bucket url (`gs://`).
+* Set `FLAGS.use_tpu` to `True`
+* Set `FLAGS.tpu_name` so the `tf.contrib.cluster_resolver.TPUClusterResolver` can find it
+* Set `FLAGS.model_dir` to a Google Cloud Storage bucket url (`gs://`).
 
 ## Optimizer
```
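The three settings the reformatted list describes must change together when moving a run from local training to a Cloud TPU. As a minimal sketch of that wiring, the snippet below uses `argparse` to define the same flag names; the flag names come from the doc, but the `argparse` setup and the `validate` helper are illustrative assumptions, not TensorFlow's own flag-parsing code.

```python
import argparse

# Flag names mirror those in using_tpu.md; the argparse wiring itself is
# an assumption for illustration, not TensorFlow's actual FLAGS machinery.
parser = argparse.ArgumentParser()
parser.add_argument("--use_tpu", action="store_true",
                    help="Train on a Cloud TPU instead of locally.")
parser.add_argument("--tpu_name", default=None,
                    help="TPU name for tf.contrib.cluster_resolver."
                         "TPUClusterResolver to look up.")
parser.add_argument("--model_dir", default="/tmp/model",
                    help="Checkpoint directory; must be a gs:// bucket URL "
                         "when training on a TPU.")

def validate(flags):
    """Check that the three settings the guide lists change together."""
    if flags.use_tpu:
        if not flags.tpu_name:
            raise ValueError("--tpu_name is required when --use_tpu is set")
        if not flags.model_dir.startswith("gs://"):
            raise ValueError("--model_dir must be a gs:// URL when "
                             "--use_tpu is set")
    return flags

# Example: the cloud-TPU configuration from the guide
# (the TPU name and bucket here are hypothetical).
flags = validate(parser.parse_args(
    ["--use_tpu", "--tpu_name", "my-tpu", "--model_dir", "gs://my-bucket/run1"]))
```

With `--use_tpu` omitted, the same flags fall back to local training with a local `model_dir`, which is the switch the doc change is describing.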