Integrations

Roboflow + NVIDIA TRT

Roboflow provides a Docker container for use with TensorRT. This container supports inference using Roboflow models for object detection, classification, and instance segmentation tasks.

Roboflow enterprise customers can use the Roboflow TensorRT Docker container for inference. To use the inference server, you will need to install the nvidia-container-runtime on the host machine. View the full TensorRT documentation for more information. You can run inference for the following tasks:
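As a rough sketch, installing the runtime and starting the container might look like the following. The package commands assume a Debian-based host, and the image name `roboflow/roboflow-inference-server-trt` is an assumption; confirm both against the TensorRT documentation:

```shell
# Install the NVIDIA container runtime (Debian/Ubuntu example; see
# NVIDIA's installation guide for other distributions).
sudo apt-get update
sudo apt-get install -y nvidia-container-runtime

# Start the Roboflow TensorRT inference server on the host network with
# access to all GPUs. The image name here is an assumption -- check the
# TensorRT documentation for the exact image to pull.
sudo docker run --net=host --gpus all roboflow/roboflow-inference-server-trt
```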

  1. Object detection
  2. Instance segmentation
  3. Image classification
  4. OpenAI CLIP
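Once the container is running, a client sends images to it over HTTP. The sketch below builds a request URL and base64-encodes an image; the `localhost:9001` host and the `/{model}/{version}?api_key=...` route are assumptions modeled on Roboflow's hosted inference API, so check the TensorRT documentation for the exact endpoint your container exposes.

```python
import base64


def build_inference_url(model_id: str, version: int, api_key: str,
                        host: str = "http://localhost:9001") -> str:
    # Hypothetical route modeled on Roboflow's hosted API; confirm the
    # exact path in the TensorRT container documentation.
    return f"{host}/{model_id}/{version}?api_key={api_key}"


def encode_image(image_bytes: bytes) -> str:
    # The server expects the image as a base64 string in the POST body.
    return base64.b64encode(image_bytes).decode("utf-8")


# Example: construct a request for version 1 of a model called "my-dataset".
url = build_inference_url("my-dataset", 1, "MY_API_KEY")
payload = encode_image(b"\x89PNG...")  # raw image bytes would go here
```

Posting `payload` to `url` (for example with `requests.post(url, data=payload)`) would then return JSON predictions for the chosen task.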
Learn More

View Content Related to NVIDIA TRT

We haven't written any guides yet on how to use Roboflow with NVIDIA TRT. If you have a question about using Roboflow and NVIDIA TRT, let us know on our Discussion Forum.

View More Deployment Integrations

Deploy a computer vision model today

Join 800,000+ developers curating high quality datasets and deploying better models with Roboflow.

Get started

Build your computer vision skills

Browse Roboflow Learn for curated learning resources that will help you advance your understanding of computer vision.

Explore Roboflow Learn