Diffstat (limited to 'tensorflow/contrib/tensorrt/README.md')
 tensorflow/contrib/tensorrt/README.md | 23 ++++-------------------
 1 file changed, 4 insertions(+), 19 deletions(-)
diff --git a/tensorflow/contrib/tensorrt/README.md b/tensorflow/contrib/tensorrt/README.md
index dfcce0fd00..461e627e99 100644
--- a/tensorflow/contrib/tensorrt/README.md
+++ b/tensorflow/contrib/tensorrt/README.md
@@ -2,7 +2,8 @@ Using TensorRT in TensorFlow
============================
This module provides necessary bindings and introduces TRT_engine_op
-operator that wraps a subgraph in TensorRT.
+operator that wraps a subgraph in TensorRT. This is still a work in progress
+but should be usable with most common graphs.
Compilation
-----------
@@ -15,26 +16,10 @@ configure script should find the necessary components from the system
automatically. If installed from tar packages, user has to set path to
location where the library is installed during configuration.
-
-```
+```shell
bazel build --config=cuda --config=opt //tensorflow/tools/pip_package:build_pip_package
bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/
```
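The resulting wheel can then be installed with pip. A minimal sketch, assuming the build above wrote the wheel to /tmp (the exact filename depends on the TensorFlow version and Python ABI):

```shell
# Install the freshly built package; adjust the glob if multiple wheels exist.
pip install /tmp/tensorflow-*.whl
```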
After the installation of tensorflow package, TensorRT transformation
-will be available. An example use is shown below.
-
-```python
-import tensorflow as tf
-import tensorflow.contrib.tensorrt as trt
-#... create and train or load model
-gdef = sess.graph.as_graph_def()
-trt_gdef = trt.create_inference_graph(
- gdef, #original graph_def
- ["output"], #name of output node(s)
- max_batch_size, #maximum batch size to run the inference
- max_workspace_size_bytes) # max memory for TensorRT to use
-tf.reset_default_graph()
-tf.import_graph_def(graph_def=trt_gdef)
-#...... run inference
-```
+will be available. An example use can be found in test/test_tftrt.py.
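For quick reference, the workflow from the example removed in this diff is reconstructed below as a minimal sketch; the output node name, batch size, and workspace size are placeholder values to adapt to your own model:

```python
import tensorflow as tf
import tensorflow.contrib.tensorrt as trt

# ... create and train or load a model, with an active session `sess` ...
gdef = sess.graph.as_graph_def()

# Convert TensorRT-compatible subgraphs into TRT engine ops.
trt_gdef = trt.create_inference_graph(
    gdef,                              # original graph_def
    ["output"],                        # name(s) of the output node(s)
    max_batch_size=1,                  # maximum batch size for inference
    max_workspace_size_bytes=1 << 30)  # max scratch memory TensorRT may use

tf.reset_default_graph()
tf.import_graph_def(graph_def=trt_gdef)
# ... run inference on the converted graph ...
```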