Dockerized TensorRT inference engine with an ONNX model conversion tool and C++ preprocessing and postprocessing implementations for ResNet50 and Ultraface


TensorRT-onnx-dockerized-inference

  • TensorRT engine inference with ONNX model conversion
  • Dockerized environment with: CUDA 10.2, TensorRT 7, OpenCV 3.4 built with CUDA
  • ResNet50 preprocessing and postprocessing implementation
  • Ultraface preprocessing and postprocessing implementation

Requirements

Build

Pull Docker image

  • Pull the container image from the repo packages:
docker pull ghcr.io/mrlaki5/tensorrt-onnx-dockerized-inference:latest

Build Docker image from sources

  • Download the TensorRT 7 installation package from NVIDIA's TensorRT download page
  • Place the downloaded TensorRT 7 .deb file in the root directory of this repo
  • Build:
cd ./docker
./build.sh

Run

From the root of the repo, start the Docker container with the command below:

./docker/run.sh

ResNet50 inference test

./ResNet50_test
  • Input image
  • Output: Siamese cat, Siamese (confidence: 0.995392)
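The confidence value in the output above is typically a softmax probability computed from the network's raw logits, followed by an argmax to pick the class. A minimal sketch of that postprocessing step, assuming this is how the repo computes it (function name and details are illustrative):

```cpp
#include <algorithm>
#include <cmath>
#include <utility>
#include <vector>

// Hypothetical sketch of classifier postprocessing: numerically stable
// softmax over the raw logits, then argmax for the predicted class.
std::pair<int, float> postprocess_resnet50(const std::vector<float>& logits) {
    // Subtract the max logit before exponentiating to avoid overflow.
    float max_logit = *std::max_element(logits.begin(), logits.end());
    float sum = 0.0f;
    for (float v : logits) sum += std::exp(v - max_logit);
    int best = static_cast<int>(
        std::max_element(logits.begin(), logits.end()) - logits.begin());
    float confidence = std::exp(logits[best] - max_logit) / sum;
    return {best, confidence};  // class index + softmax confidence
}
```

The class index is then mapped to a human-readable label (e.g. "Siamese cat, Siamese") via the ImageNet label list.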

Ultraface detector inference test

  • Note: this test requires a camera device. It opens a GUI window showing the camera stream overlaid with face detections.
./Ultraface_test
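Face detectors like Ultraface emit many overlapping candidate boxes per face, so the postprocessing stage conventionally applies score thresholding followed by non-maximum suppression (NMS). The sketch below shows a greedy NMS pass; the struct and function names are illustrative and not taken from the repo's code.

```cpp
#include <algorithm>
#include <vector>

// Hypothetical detection box: corner coordinates plus a confidence score.
struct Box { float x1, y1, x2, y2, score; };

// Intersection-over-union of two axis-aligned boxes.
float iou(const Box& a, const Box& b) {
    float ix = std::max(0.0f, std::min(a.x2, b.x2) - std::max(a.x1, b.x1));
    float iy = std::max(0.0f, std::min(a.y2, b.y2) - std::max(a.y1, b.y1));
    float inter = ix * iy;
    float area_a = (a.x2 - a.x1) * (a.y2 - a.y1);
    float area_b = (b.x2 - b.x1) * (b.y2 - b.y1);
    return inter / (area_a + area_b - inter);
}

// Greedy non-maximum suppression: keep the highest-scoring box and drop
// any remaining box that overlaps a kept box by more than iou_thresh.
std::vector<Box> nms(std::vector<Box> boxes, float iou_thresh) {
    std::sort(boxes.begin(), boxes.end(),
              [](const Box& a, const Box& b) { return a.score > b.score; });
    std::vector<Box> kept;
    for (const Box& cand : boxes) {
        bool keep = true;
        for (const Box& k : kept)
            if (iou(cand, k) > iou_thresh) { keep = false; break; }
        if (keep) kept.push_back(cand);
    }
    return kept;
}
```

The surviving boxes are what the test draws over the camera stream.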
