iwatake2222/play_with_tensorrt

Sample projects for TensorRT in C++

Quality score: 46 / 100 (Emerging)

This project provides sample C++ code and project structures to help developers implement high-performance deep learning inference on NVIDIA GPUs using TensorRT. It takes image or video files, or live camera feeds, and processes them with pre-trained models. This is ideal for C++ developers building GPU-accelerated applications that require fast, efficient execution of AI models.
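The core flow these samples wrap is the standard TensorRT C++ inference sequence: deserialize a pre-built engine, create an execution context, copy data to device buffers, and enqueue inference on a CUDA stream. A minimal sketch of that sequence (not this repo's actual helper classes; `model.engine`, the tensor sizes, and TensorRT 8+ are assumptions):

```cpp
#include <cstdio>
#include <fstream>
#include <iterator>
#include <memory>
#include <vector>
#include <NvInfer.h>
#include <cuda_runtime_api.h>

// Minimal logger required by the TensorRT runtime.
class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::fprintf(stderr, "%s\n", msg);
    }
};

int main() {
    // Hypothetical serialized engine; build one beforehand, e.g. with trtexec.
    std::ifstream file("model.engine", std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(file)),
                           std::istreambuf_iterator<char>());

    Logger logger;
    // TensorRT 8+: interface objects have public destructors, so
    // std::unique_ptr with the default deleter is sufficient.
    std::unique_ptr<nvinfer1::IRuntime> runtime{
        nvinfer1::createInferRuntime(logger)};
    std::unique_ptr<nvinfer1::ICudaEngine> engine{
        runtime->deserializeCudaEngine(blob.data(), blob.size())};
    std::unique_ptr<nvinfer1::IExecutionContext> context{
        engine->createExecutionContext()};

    // Assumed shapes: 1x3x224x224 float input, 1000-class float output.
    constexpr size_t kInputBytes  = 1 * 3 * 224 * 224 * sizeof(float);
    constexpr size_t kOutputBytes = 1000 * sizeof(float);

    void* bindings[2] = {nullptr, nullptr};
    cudaMalloc(&bindings[0], kInputBytes);   // input buffer
    cudaMalloc(&bindings[1], kOutputBytes);  // output buffer

    // Run inference asynchronously on a CUDA stream.
    cudaStream_t stream;
    cudaStreamCreate(&stream);
    context->enqueueV2(bindings, stream, nullptr);
    cudaStreamSynchronize(stream);

    cudaFree(bindings[0]);
    cudaFree(bindings[1]);
    cudaStreamDestroy(stream);
    return 0;
}
```

In practice the repo adds preprocessing (resize, normalize, NCHW conversion) before the copy to the input buffer and postprocessing after the copy back, which is what the per-model sample projects demonstrate.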

197 stars. No commits in the last 6 months.

Use this if you are a C++ developer who needs a clear, multi-platform example of integrating TensorRT into your applications for accelerated AI model inference.

Not ideal if you are not a C++ developer or are looking for a high-level Python library for deep learning inference.

Tags: deep-learning-inference · GPU-acceleration · C++-development · edge-AI · computer-vision
Stale (6 months) · No package · No dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 20 / 25


Stars: 197
Forks: 34
Language: C++
License: Apache-2.0
Last pushed: Feb 17, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/iwatake2222/play_with_tensorrt"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.