Roboflow provides a Docker container for running inference with TensorRT. The container supports Roboflow models for object detection, classification, and instance segmentation tasks.

The Roboflow TensorRT Docker container is available to Roboflow enterprise customers. To use the inference server, you will need to install the nvidia-container-runtime. See the full TensorRT documentation for more information.
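As a rough sketch, pulling and starting the container might look like the following. The image name, tag, and port shown here are assumptions, not confirmed values; consult the full TensorRT documentation for the exact commands for your deployment.

```shell
# Pull the TensorRT inference server image.
# NOTE: the image name and tag are assumptions -- check the Roboflow
# enterprise documentation for the exact image to use.
sudo docker pull roboflow/roboflow-inference-server-trt:latest

# Run the server with GPU access via the nvidia container runtime.
# --net=host exposes the server's port on the host directly
# (the default inference port is an assumption here as well).
sudo docker run --runtime=nvidia --net=host \
    roboflow/roboflow-inference-server-trt:latest
```

The `--runtime=nvidia` flag requires that nvidia-container-runtime is installed and registered with Docker, which is why that prerequisite is called out above.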