Running TensorFlow Models with the NVIDIA TensorRT Inference Engine

I would like to use NVIDIA TensorRT to run my TensorFlow models. Currently, TensorRT supports Caffe prototxt network descriptor files.

I could not find any source code for converting TensorFlow models to Caffe models. Are there any workarounds?


TensorRT 3.0 supports importing/converting TensorFlow graphs through UFF (Universal Framework Format). Some layer implementations are missing and will require custom implementations via the IPlugin interface.

Previous versions did not support native import of TensorFlow models/checkpoints.

What you can also do is export the layer/network description into your own intermediate format (for example, a text file) and then use the TensorRT C++ API to build the inference engine. You will have to export the convolution weights/biases separately. Be sure to pay attention to the data layout: TensorFlow uses NHWC, while TensorRT uses NCHW. And for weights, TensorFlow uses RSCK ([filter_height, filter_width, input_depth, output_depth]), while TensorRT uses KCRS.
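The layout conversions described above can be sketched with NumPy transposes; the array names and shapes below are illustrative assumptions, not part of any TensorRT API:

```python
import numpy as np

# TensorFlow convolution weights in RSCK layout:
# [filter_height, filter_width, input_depth, output_depth]
w_tf = np.arange(3 * 3 * 8 * 16, dtype=np.float32).reshape(3, 3, 8, 16)

# TensorRT expects KCRS: [output_depth, input_depth, filter_height, filter_width].
# Move axis 3 (K) first and axis 2 (C) second, keeping R and S in order.
w_trt = w_tf.transpose(3, 2, 0, 1)
print(w_trt.shape)  # (16, 8, 3, 3)

# Activations follow the same idea: NHWC -> NCHW.
x_nhwc = np.zeros((1, 224, 224, 3), dtype=np.float32)
x_nchw = x_nhwc.transpose(0, 3, 1, 2)
print(x_nchw.shape)  # (1, 3, 224, 224)
```

Note that `transpose` only changes the logical axis order; call `np.ascontiguousarray` on the result if the consumer requires a contiguous memory buffer.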

See this paper for an extended discussion of tensor formats: https://arxiv.org/abs/1410.0759

This link also contains useful information: https://www.tensorflow.org/versions/master/extend/tool_developers/


No workarounds are needed anymore, since TensorRT 3 added native TensorFlow support.


Source: https://habr.com/ru/post/1013149/
