Diffstat (limited to 'tensorflow/docs_src/programmers_guide/saved_model.md')
-rw-r--r--  tensorflow/docs_src/programmers_guide/saved_model.md | 6
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/tensorflow/docs_src/programmers_guide/saved_model.md b/tensorflow/docs_src/programmers_guide/saved_model.md
index fd55731d8e..fa7a94cc06 100644
--- a/tensorflow/docs_src/programmers_guide/saved_model.md
+++ b/tensorflow/docs_src/programmers_guide/saved_model.md
@@ -479,10 +479,10 @@ does not specify one.
 ### Serving the exported model locally
 
 For local deployment, you can serve your model using
-[TensorFlow Serving](http://github.com/tensorflow/serving), an open-source project that loads a
-SavedModel and exposes it as a [gRPC](http://www.grpc.io/) service.
+[TensorFlow Serving](https://github.com/tensorflow/serving), an open-source project that loads a
+SavedModel and exposes it as a [gRPC](https://www.grpc.io/) service.
 
-First, [install TensorFlow Serving](http://github.com/tensorflow/serving).
+First, [install TensorFlow Serving](https://github.com/tensorflow/serving).
 
 Then build and run the local model server, substituting `$export_dir_base` with
 the path to the SavedModel you exported above:
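
For context, the patched page goes on to show the server invocation itself. A minimal sketch of what that build-and-run step typically looks like is given below; the Bazel target path, port number, and flag values here are assumptions based on the TensorFlow Serving repository at the time, not part of this diff:

```sh
# Build the model server binary from the TensorFlow Serving source tree.
bazel build //tensorflow_serving/model_servers:tensorflow_model_server

# Serve the exported SavedModel over gRPC on port 9000.
# $export_dir_base is the directory produced by the export step above.
bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
    --port=9000 --model_base_path=$export_dir_base
```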