Diffstat (limited to 'tensorflow/docs_src/programmers_guide/saved_model.md')
-rw-r--r--  tensorflow/docs_src/programmers_guide/saved_model.md | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/tensorflow/docs_src/programmers_guide/saved_model.md b/tensorflow/docs_src/programmers_guide/saved_model.md
index e169efe19c..9262143ad8 100644
--- a/tensorflow/docs_src/programmers_guide/saved_model.md
+++ b/tensorflow/docs_src/programmers_guide/saved_model.md
@@ -440,10 +440,10 @@ does not specify one.
### Serving the exported model locally
For local deployment, you can serve your model using
-@{$deploy/tfserve$Tensorflow Serving}, an open-source project that loads a
+[TensorFlow Serving](http://github.com/tensorflow/serving), an open-source project that loads a
SavedModel and exposes it as a [gRPC](http://www.grpc.io/) service.
-First, [install TensorFlow Serving](https://tensorflow.github.io/serving/setup#prerequisites).
+First, [install TensorFlow Serving](http://github.com/tensorflow/serving).
Then build and run the local model server, substituting `$export_dir_base` with
the path to the SavedModel you exported above:
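
The diff context cuts off before the command block that follows in the source file. As a hedged sketch only (not the exact commands from the doc), building and running the model server from a TensorFlow Serving checkout with Bazel might look like the following; the port number and `--model_name=default` value are illustrative assumptions:

```sh
# Build the model server binary from the TensorFlow Serving source tree
# (assumes the tensorflow/serving repo is checked out and Bazel is installed).
bazel build //tensorflow_serving/model_servers:tensorflow_model_server

# Serve the exported SavedModel over gRPC.
# $export_dir_base is the directory produced by the export step above;
# the model name "default" and port 9000 are assumptions for illustration.
bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
  --port=9000 --model_name=default --model_base_path=$export_dir_base
```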