author      Suyog Gupta <suyoggupta@google.com>        2018-10-02 14:46:13 -0700
committer   TensorFlower Gardener <gardener@tensorflow.org>        2018-10-02 14:50:48 -0700
commit  891e49f57b8229f58315cfeb743e38c235918083 (patch)
tree    907d668e713c5d200c4b056fee5e926cdb469019 /tensorflow/contrib/model_pruning
parent  05812d761031b108b43560c90867b96dc4f030eb (diff)
Add missing documentation for use_tpu hparam
PiperOrigin-RevId: 215462000
Diffstat (limited to 'tensorflow/contrib/model_pruning')
-rw-r--r--  tensorflow/contrib/model_pruning/README.md | 1 +
1 file changed, 1 insertion, 0 deletions
diff --git a/tensorflow/contrib/model_pruning/README.md b/tensorflow/contrib/model_pruning/README.md
index 15d95896d9..b313024e28 100644
--- a/tensorflow/contrib/model_pruning/README.md
+++ b/tensorflow/contrib/model_pruning/README.md
@@ -62,6 +62,7 @@ The pruning library allows for specification of the following hyper parameters:
| sparsity_function_begin_step | integer | 0 | The global step at which the gradual sparsity function begins to take effect |
| sparsity_function_end_step | integer | 100 | The global step used as the end point for the gradual sparsity function |
| sparsity_function_exponent | float | 3.0 | exponent = 1 varies sparsity linearly between the initial and final values; exponent > 1 makes sparsity change more slowly towards the end than at the beginning |
+| use_tpu | bool | False | Whether training is performed on TPUs |
The sparsity $$s_t$$ at global step $$t$$ is given by:
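The formula itself falls outside this diff hunk. Reconstructed from the hyperparameter descriptions above and the gradual pruning schedule this library is based on (Zhu & Gupta, "To prune, or not to prune", 2017), it should take roughly the following form; this is a sketch, not a quote from the README:

$$ s_t = s_f + (s_i - s_f)\left(1 - \frac{t - t_0}{t_n - t_0}\right)^{e}, \qquad t_0 \le t \le t_n $$

where $$s_i$$ and $$s_f$$ are the initial and final (target) sparsity, $$t_0$$ is `sparsity_function_begin_step`, $$t_n$$ is `sparsity_function_end_step`, and $$e$$ is `sparsity_function_exponent` (3.0 by default, giving a cubic schedule).

For context on where the newly documented `use_tpu` flag would be set: the README's training recipe parses a comma-separated hyperparameter string. Below is a minimal sketch, assuming the `get_pruning_hparams`/`Pruning` entry points documented elsewhere in this README; the schedule values are made up for illustration.

```python
import tensorflow as tf
from tensorflow.contrib.model_pruning.python import pruning

# Hypothetical hyperparameter string; note the use_tpu flag added by this commit.
hparam_str = ("name=example_pruning,begin_pruning_step=1000,"
              "end_pruning_step=20000,target_sparsity=0.9,use_tpu=true")
pruning_hparams = pruning.get_pruning_hparams().parse(hparam_str)

# The Pruning object adds mask-update ops for layers wrapped with
# pruning.apply_mask(); run mask_update_op alongside the training op.
global_step = tf.train.get_or_create_global_step()
pruning_obj = pruning.Pruning(pruning_hparams, global_step=global_step)
mask_update_op = pruning_obj.conditional_mask_update_op()
```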