# mev-yolodeep (1.1.13)
## Installation

```shell
pip install mev-yolodeep
```
## About this package

### DeepStream
### References
- https://darlingevil.com/riding-nvidias-slipstream-with-python/
- https://github.com/MegaMosquito/slipstream
- https://github.com/marcoslucianops/DeepStream-Yolo/tree/master#basic-usage
- https://github.com/NVIDIA-AI-IOT/deepstream_python_apps/blob/master/notebooks/deepstream_test_1.ipynb
- https://github.com/ml6team/deepstream-python
- https://github.com/NVIDIA-AI-IOT/deepstream_python_apps/blob/master/apps/deepstream-test1/deepstream_test_1.py
- Tracker
### Requirements

DeepStream requires a GPU with compute capability greater than 6.0. Check yours with:

```shell
nvidia-smi --query-gpu=compute_cap --format=csv
```
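If you want to gate a setup script on that requirement, the CSV output of the query above can be parsed with a few lines of Python (an illustrative sketch, not part of the package; `meets_requirement` is a hypothetical helper name):

```python
def meets_requirement(csv_output: str, minimum: float = 6.0) -> bool:
    """Parse `nvidia-smi --query-gpu=compute_cap --format=csv` output and
    check that every listed GPU exceeds the minimum compute capability."""
    lines = [ln.strip() for ln in csv_output.strip().splitlines()]
    caps = [float(ln) for ln in lines[1:]]  # skip the 'compute_cap' header row
    return bool(caps) and all(cap > minimum for cap in caps)

# Example: output for a single GPU with compute capability 8.6
print(meets_requirement("compute_cap\n8.6\n"))  # → True
```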
### Compile the DeepStream YOLOv8 library and `*.engine`

For detection, build the `nvdsinfer` libraries (inside the DeepStream 6.3 container):

```shell
task build
task cli:run
cd yolodeep/assets/nvdsinfer/
make
```

After running `make` you should find two new libraries:

```
/runtime/models/yolov8-6.3.so
/runtime/models/yolov8-6.3-cls.so
```
Check `env.sample` and modify it according to your local configuration, then create the `.env` file:

```shell
cp env.sample .env
```

NB: copy the content of `models` in the root folder to `yolodeep/models`, or modify `docker-compose.yml` to mount the volume `../models:/models` (which is defined in `env.sample`). Remember to remove all `.engine` files in `../models/`.
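Clearing out stale engines by hand is easy to forget; a minimal sketch of scripting it with `pathlib` (illustrative only, `remove_engines` is not part of the package):

```python
from pathlib import Path

def remove_engines(models_dir: str) -> list[str]:
    """Delete every cached TensorRT `.engine` file under models_dir so that
    DeepStream rebuilds them against the current configuration."""
    removed = []
    for engine in Path(models_dir).glob("*.engine"):
        engine.unlink()  # delete the stale serialized engine
        removed.append(engine.name)
    return removed
```

Run it against `../models/` before starting the container so the engines are regenerated for your GPU.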
### Run example

#### deepstream-app

```shell
cd /DeepStream-Yolo/
deepstream-app -c deepstream_app_config.txt
```

Or test the Python bindings (the `yolodeep/assets` runtime is required):

```shell
helloworld --show
```
#### yolodeep

```shell
yolodeep [--pipeline default] [--debug] [--show]
yolodeep -p demo --show
yolodeep -p demo --key
yolodeep -p demo --rtsp  # and in another shell: task rtsp
```
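The flag surface above could be wired up with `argparse` roughly as follows (a hypothetical reconstruction for illustration; the actual yolodeep entry point may differ):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Sketch of the yolodeep CLI flags shown above; defaults are assumptions.
    parser = argparse.ArgumentParser(prog="yolodeep")
    parser.add_argument("-p", "--pipeline", default="default",
                        help="pipeline to run (e.g. 'default' or 'demo')")
    parser.add_argument("--debug", action="store_true", help="enable debug output")
    parser.add_argument("--show", action="store_true", help="render output in a window")
    parser.add_argument("--key", action="store_true", help="enable keyboard control")
    parser.add_argument("--rtsp", action="store_true", help="serve output over RTSP")
    return parser

args = build_parser().parse_args(["-p", "demo", "--show"])
print(args.pipeline, args.show)  # → demo True
```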
### Make a pipeline diagram

```shell
GST_DEBUG_DUMP_DOT_DIR=/tmp yolodeep <bla bla>
dot -Tpng /tmp/pipeline.dot > /samples/pipeline.png
```
### Convert a model from `.pt` to a TRT engine

#### Convert `.pt` to ONNX (Ultralytics container or host environment)

```shell
# cd models && wget https://github.com/ultralytics/assets/releases/download/v0.0.0/yolov8s.pt
cd yolodeep/DeepStream-Yolo/
python utils/export_yoloV8.py -w models/yolov8s.pt --size 640 --simplify --dynamic
mv yolov8s.onnx models/yolov8s-ds.onnx
```
#### Convert ONNX to an engine (DeepStream container; see the docs)

```shell
task cli
trtexec [--fp16] --onnx=/models/yolov8s-ds.onnx --saveEngine=/models/yolov8s-ds.engine [--maxShapes=input:2x3x640x640] [--shapes=input:2x3x640x640]
```
### TODO: new lighter container (not working)

Download these files and put them into the `x86_64` folder:

```
downloads/
├── deepstream-6.3_6.3.0-1_amd64.deb
└── nv-tensorrt-local-repo-ubuntu2004-8.5.3-cuda-11.8_1.0-1_amd64.deb
```

```shell
wget --content-disposition 'https://api.ngc.nvidia.com/v2/resources/org/nvidia/deepstream/6.3/files?redirect=true&path=deepstream_sdk_v6.3.0_x86_64.tbz2' -O deepstream_sdk_v6.3.0_x86_64.tbz2
wget --content-disposition 'https://api.ngc.nvidia.com/v2/resources/org/nvidia/deepstream/6.3/files?redirect=true&path=deepstream-6.3_6.3.0-1_amd64.deb' -O deepstream-6.3_6.3.0-1_amd64.deb
```
### WAMP

To test the WAMP communication, just clone this repo and follow the instructions.