author    A. Unique TensorFlower <gardener@tensorflow.org>    2017-11-07 13:59:09 -0800
committer    TensorFlower Gardener <gardener@tensorflow.org>    2017-11-07 14:02:28 -0800
commit    7183348f3270b7f9c1b333970e4f9abf6b3c4d8a (patch)
tree    b753115ed58524bc9ff3b49384722c8c8bca6d6f /tensorflow/contrib/model_pruning
parent    6c9818aa00755df7bb995f0a47f8600a0202ae29 (diff)
Fix documentation for contrib/model_pruning
PiperOrigin-RevId: 174907982
Diffstat (limited to 'tensorflow/contrib/model_pruning')
-rw-r--r--    tensorflow/contrib/model_pruning/README.md | 95
1 file changed, 15 insertions(+), 80 deletions(-)
diff --git a/tensorflow/contrib/model_pruning/README.md b/tensorflow/contrib/model_pruning/README.md
index a8427e6014..764e126e0d 100644
--- a/tensorflow/contrib/model_pruning/README.md
+++ b/tensorflow/contrib/model_pruning/README.md
@@ -20,7 +20,7 @@ conv = tf.nn.conv2d(images, pruning.apply_mask(weights), stride, padding)
This creates a convolutional layer with additional variables mask and threshold
as shown below: ![Convolutional layer with mask and
-threshold](./mask.png "Convolutional layer with mask and threshold")
+threshold](https://storage.googleapis.com/download.tensorflow.org/example_images/mask.png "Convolutional layer with mask and threshold")
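
For orientation, here is a minimal, self-contained sketch of the masked convolution described above; the input placeholder, weight shape, and initializer are illustrative assumptions and are not part of the original README.

```python
import tensorflow as tf
from tensorflow.contrib.model_pruning.python import pruning

# Illustrative input and weight shapes (assumptions, not from the README).
images = tf.placeholder(tf.float32, [None, 32, 32, 3])
weights = tf.get_variable(
    'conv_weights', shape=[3, 3, 3, 16],
    initializer=tf.truncated_normal_initializer(stddev=0.1))

# apply_mask wraps the weight tensor and attaches the auxiliary `mask` and
# `threshold` variables; the convolution consumes the masked weights.
conv = tf.nn.conv2d(images, pruning.apply_mask(weights),
                    strides=[1, 1, 1, 1], padding='SAME')
```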
Alternatively, the API also provides variants of tensorflow layers with these
auxiliary variables built-in (see
@@ -37,82 +37,20 @@ auxiliary variables built-in (see
The pruning library allows for specification of the following hyperparameters:
-| Hyperparameter | Type | Default | Description |
-| ---------------------------- | ------- | ------------- | -------------- |
-| name | string | model_pruning | Name of the |
-: : : : pruning :
-: : : : specification. :
-: : : : Used for :
-: : : : adding :
-: : : : summaries and :
-: : : : ops under a :
-: : : : common :
-: : : : tensorflow :
-: : : : name_scope :
-| begin_pruning_step | integer | 0 | The global |
-: : : : step at which :
-: : : : to begin :
-: : : : pruning :
-| end_pruning_step | integer | -1 | The global |
-: : : : step at which :
-: : : : to terminate :
-: : : : pruning. :
-: : : : Defaults to -1 :
-: : : : implying that :
-: : : : pruning :
-: : : : continues till :
-: : : : the training :
-: : : : stops :
-| do_not_prune | list of | [""] | list of layers |
-: : strings : : that are not :
-: : : : pruned :
-| threshold_decay | float | 0.9 | The decay |
-: : : : factor to use :
-: : : : for :
-: : : : exponential :
-: : : : decay of the :
-: : : : thresholds :
-| pruning_frequency | integer | 10 | How often |
-: : : : should the :
-: : : : masks be :
-: : : : updated? (in # :
-: : : : of :
-: : : : global_steps). :
-| nbins | integer | 255 | Number of bins |
-: : : : to use for :
-: : : : histogram :
-: : : : computation :
-| initial_sparsity | float | 0.0 | Initial |
-: : : : sparsity value :
-| target_sparsity | float | 0.5 | Target |
-: : : : sparsity value :
-| sparsity_function_begin_step | integer | 0 | The global |
-: : : : step at this :
-: : : : which the :
-: : : : gradual :
-: : : : sparsity :
-: : : : function :
-: : : : begins to take :
-: : : : effect :
-| sparsity_function_end_step | integer | 100 | The global |
-: : : : step used as :
-: : : : the end point :
-: : : : for the :
-: : : : gradual :
-: : : : sparsity :
-: : : : function :
-| sparsity_function_exponent | float | 3.0 | exponent = 1 |
-: : : : is linearly :
-: : : : varying :
-: : : : sparsity :
-: : : : between :
-: : : : initial and :
-: : : : final. :
-: : : : exponent > 1 :
-: : : : varies more :
-: : : : slowly towards :
-: : : : the end than :
-: : : : the beginning :
+|Hyperparameter | Type | Default | Description |
+|:----------------------------|:-------:|:-------------:|:--------------|
+| name | string | model_pruning | Name of the pruning specification. Used for adding summaries and ops under a common tensorflow name_scope |
+| begin_pruning_step | integer | 0 | The global step at which to begin pruning |
+| end_pruning_step | integer | -1 | The global step at which to terminate pruning. Defaults to -1, implying that pruning continues until training stops |
+| do_not_prune | list of strings | [""] | List of layers that are not pruned |
+| threshold_decay | float | 0.9 | The decay factor to use for exponential decay of the thresholds |
+| pruning_frequency | integer | 10 | How often should the masks be updated? (in # of global_steps) |
+| nbins | integer | 255 | Number of bins to use for histogram computation |
+| initial_sparsity | float | 0.0 | Initial sparsity value |
+| target_sparsity | float | 0.5 | Target sparsity value |
+| sparsity_function_begin_step | integer | 0 | The global step at which the gradual sparsity function begins to take effect |
+| sparsity_function_end_step | integer | 100 | The global step used as the end point for the gradual sparsity function |
+| sparsity_function_exponent | float | 3.0 | exponent = 1 varies sparsity linearly between the initial and target values; exponent > 1 varies sparsity more slowly towards the end than the beginning |
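
As a rough usage sketch of the hyperparameters in the table above: the comma-separated override string and its values below are illustrative assumptions, not taken from the README.

```python
import tensorflow as tf
from tensorflow.contrib.model_pruning.python import pruning

# Start from the library defaults listed in the table, then override a few
# values via a comma-separated hyperparameter string (values are illustrative).
hparams = pruning.get_pruning_hparams().parse(
    'name=my_pruning,begin_pruning_step=1000,end_pruning_step=20000,'
    'target_sparsity=0.75,pruning_frequency=100')

global_step = tf.train.get_or_create_global_step()

# The Pruning object builds the sparsity schedule and mask-update logic;
# conditional_mask_update_op() only updates masks while the global step is
# within [begin_pruning_step, end_pruning_step].
p = pruning.Pruning(hparams, global_step=global_step)
mask_update_op = p.conditional_mask_update_op()
p.add_pruning_summaries()
```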
The sparsity $$s_t$$ at global step $$t$$ is given by:
@@ -190,6 +128,3 @@ Eval:
```shell
$ bazel-bin/$examples_dir/cifar10/cifar10_eval --run_once
```
-
-TODO(suyoggupta): Add figures showing the sparsity function, sparsity for
-different layers etc.