author    Russell Power <power@google.com>    2018-09-26 14:36:59 -0700
committer TensorFlower Gardener <gardener@tensorflow.org>    2018-09-26 14:41:28 -0700
commit a1801ecdbb75b4583d757204611afd9af28b4a49 (patch)
tree   97f8e453de9d819a33bd4d54bc87278814c56fea /tensorflow/contrib/tpu/__init__.py
parent 2116c6649cfe339ce8a3859eb425806db8ae32b9 (diff)
Add experimental asynchronous checkpoint hook.
This triggers checkpoints in a separate thread while allowing training to continue, effectively parallelizing checkpointing and training for workloads such as TPUEstimator, where the weights are only updated after a number of device iterations.

PiperOrigin-RevId: 214670991
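The mechanism the commit message describes — handing the save off to a background thread so the training loop keeps running — can be sketched generically. This is an illustrative sketch only, not the contrib/tpu `AsyncCheckpointSaverHook` implementation; the `AsyncSaver` class and `save_fn` callback are hypothetical names:

```python
import threading


class AsyncSaver:
    """Hypothetical sketch: run each checkpoint save in a background
    thread so the caller (the training loop) returns immediately."""

    def __init__(self, save_fn):
        self._save_fn = save_fn  # callable that performs the actual save
        self._thread = None

    def trigger_save(self, step):
        # Wait for any in-flight save first, so saves never overlap.
        self.join()
        self._thread = threading.Thread(target=self._save_fn, args=(step,))
        self._thread.start()  # returns at once; training can continue

    def join(self):
        # Block until the pending save (if any) has finished.
        if self._thread is not None:
            self._thread.join()
            self._thread = None


saved_steps = []
saver = AsyncSaver(lambda step: saved_steps.append(step))
for step in range(3):
    saver.trigger_save(step)  # non-blocking from the loop's perspective
saver.join()  # drain the last save before reading results
print(saved_steps)  # [0, 1, 2]
```

Because `trigger_save` joins the previous save before starting a new one, checkpoints are serialized with respect to each other while still overlapping with the training computation between triggers.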
Diffstat (limited to 'tensorflow/contrib/tpu/__init__.py')
-rw-r--r--  tensorflow/contrib/tpu/__init__.py  1
1 file changed, 1 insertion, 0 deletions
diff --git a/tensorflow/contrib/tpu/__init__.py b/tensorflow/contrib/tpu/__init__.py
index 3c0456dc2f..766466968a 100644
--- a/tensorflow/contrib/tpu/__init__.py
+++ b/tensorflow/contrib/tpu/__init__.py
@@ -55,6 +55,7 @@
@@TPUDistributionStrategy
@@keras_to_tpu_model
+@@AsyncCheckpointSaverHook
"""
from __future__ import absolute_import