mev-yolodeep (1.1.13)

Published 2024-10-25 09:35:23 +02:00 by sscipioni

Installation

pip install --index-url <package index URL> mev-yolodeep

About this package

DEEPSTREAM

references

requirements

deepstream requires a GPU with compute capability > 6.0; check yours with:

nvidia-smi --query-gpu=compute_cap --format=csv
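The compute-capability check can be scripted. Below is a hypothetical helper (not part of mev-yolodeep) that parses the CSV output of the `nvidia-smi` query above and verifies the > 6.0 requirement:

```python
# Hypothetical helper: parse the CSV output of
# `nvidia-smi --query-gpu=compute_cap --format=csv`
# and check the > 6.0 requirement for every GPU.
import subprocess


def parse_compute_caps(csv_text: str) -> list[float]:
    """Return the compute capability of each GPU from nvidia-smi CSV output."""
    lines = [ln.strip() for ln in csv_text.strip().splitlines()]
    # The first line is the header "compute_cap"; the rest are one value per GPU.
    return [float(ln) for ln in lines[1:]]


def gpus_supported(csv_text: str, minimum: float = 6.0) -> bool:
    """True if at least one GPU is present and all exceed the minimum."""
    caps = parse_compute_caps(csv_text)
    return bool(caps) and all(cap > minimum for cap in caps)


if __name__ == "__main__":
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=compute_cap", "--format=csv"],
        capture_output=True, text=True, check=True,
    ).stdout
    print("supported:", gpus_supported(out))
```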

compile deepstream yolov8 library and *.engine

For detection: create nvdsinfer libraries (deepstream 6.3 container)

task build
task cli:run
cd yolodeep/assets/nvdsinfer/
make

After running make you should find two new libraries:

/runtime/models/yolov8-6.3.so
/runtime/models/yolov8-6.3-cls.so
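Before wiring these paths into a DeepStream config, it can help to confirm the build actually produced them. A hypothetical sanity check (not shipped with the package):

```python
# Hypothetical sanity check: verify the nvdsinfer libraries produced
# by `make` exist before referencing them from the DeepStream config.
from pathlib import Path

EXPECTED = [
    "/runtime/models/yolov8-6.3.so",
    "/runtime/models/yolov8-6.3-cls.so",
]


def missing_libs(paths: list[str]) -> list[str]:
    """Return the subset of expected library paths that do not exist."""
    return [p for p in paths if not Path(p).is_file()]


if __name__ == "__main__":
    missing = missing_libs(EXPECTED)
    if missing:
        raise SystemExit(f"run `make` first, missing: {missing}")
    print("nvdsinfer libraries in place")
```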

Check and modify env.sample according to your local configuration, then create the .env file:

cp env.sample .env

NB: copy the content of the root-level models folder to yolodeep/models, or modify docker-compose.yml to mount the volume ../models:/models (which is defined in env.sample). Remember to remove all .engine files in ../models/.
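If you go the volume-mount route, the relevant docker-compose.yml fragment looks roughly like this (the service name `yolodeep` is an assumption; only the ../models:/models mount comes from env.sample):

```yaml
services:
  yolodeep:
    volumes:
      # mount the repository-root models folder into the container,
      # as referenced by env.sample
      - ../models:/models
```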

run example

deepstream app

cd /DeepStream-Yolo/
deepstream-app -c deepstream_app_config.txt

or test the Python bindings (the yolodeep/assets runtime is required):

helloworld --show

run yolodeep

yolodeep [--pipeline default] [--debug] [--show]

yolodeep -p demo --show
yolodeep -p demo --key
yolodeep -p demo --rtsp # and in another shell task rtsp 
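The flags above suggest a CLI surface along these lines. This is a hypothetical argparse sketch for illustration, not the package's actual entry point:

```python
# Hypothetical sketch of the yolodeep CLI flags shown above;
# the real entry point lives inside the mev-yolodeep package.
import argparse


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="yolodeep")
    parser.add_argument("-p", "--pipeline", default="default",
                        help="pipeline to run (e.g. demo)")
    parser.add_argument("--debug", action="store_true",
                        help="verbose logging")
    parser.add_argument("--show", action="store_true",
                        help="render an output window")
    parser.add_argument("--key", action="store_true",
                        help="enable keyboard control")
    parser.add_argument("--rtsp", action="store_true",
                        help="serve results over RTSP")
    return parser


if __name__ == "__main__":
    print(build_parser().parse_args())
```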

make diagram

GST_DEBUG_DUMP_DOT_DIR=/tmp yolodeep <usual arguments>
dot -Tpng /tmp/pipeline.dot > /samples/pipeline.png

convert model from pt to trt engine

convert *.pt to onnx (ultralytics container or host environment)

# cd models && wget https://github.com/ultralytics/assets/releases/download/v0.0.0/yolov8s.pt
cd yolodeep/DeepStream-Yolo/
python utils/export_yoloV8.py -w models/yolov8s.pt --size 640 --simplify --dynamic
mv yolov8s.onnx models/yolov8s-ds.onnx

convert onnx to engine (deepstream container) docs

task cli
trtexec [--fp16] --onnx=/models/yolov8s-ds.onnx --saveEngine=/models/yolov8s-ds.engine [--maxShapes=input:2x3x640x640] [--shapes=input:2x3x640x640]

todo new lighter container (not working)

download these files and put them into x86_64 folder

downloads/
├── deepstream-6.3_6.3.0-1_amd64.deb
└── nv-tensorrt-local-repo-ubuntu2004-8.5.3-cuda-11.8_1.0-1_amd64.deb
wget --content-disposition 'https://api.ngc.nvidia.com/v2/resources/org/nvidia/deepstream/6.3/files?redirect=true&path=deepstream_sdk_v6.3.0_x86_64.tbz2' -O deepstream_sdk_v6.3.0_x86_64.tbz2

wget --content-disposition 'https://api.ngc.nvidia.com/v2/resources/org/nvidia/deepstream/6.3/files?redirect=true&path=deepstream-6.3_6.3.0-1_amd64.deb' -O deepstream-6.3_6.3.0-1_amd64.deb

WAMP

To test the WAMP communication, clone this repo and follow the instructions.

Requirements

Requires Python: >=3.10
Details

Registry: PyPI
License: MIT
Size: 225 KiB

Versions
1.1.13 2024-10-25
1.1.12 2024-08-28
1.1.10 2024-05-30