[Detector Support]: Yolov8 use in 0.14 Beta 3 #12011
Replies: 4 comments 19 replies
-
YOLOv8 support was removed due to licensing issues: #10717
-
You should give YOLO-NAS a try instead: https://deploy-preview-11419--frigate-docs.netlify.app/configuration/object_detectors#yolo-nas
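For reference, the linked preview docs describe an OpenVINO YOLO-NAS setup roughly along these lines. The model path and labelmap path below are placeholders, not the exact values from the docs, so check the linked page for the export steps before copying:

```yaml
detectors:
  ov:
    type: openvino
    device: AUTO

model:
  model_type: yolonas
  width: 320
  height: 320
  input_tensor: nchw
  input_pixel_format: bgr
  path: /config/yolo_nas_s.onnx       # placeholder: your exported model
  labelmap_path: /config/coco_80cl.txt # placeholder: your labelmap
```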
-
What is the impact of the size change from 320 to 256 or lower?
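Not an authoritative answer, but as a rough rule of thumb detector compute scales with the input pixel count, so shrinking the input trades inference cost against small-object detail. A quick back-of-envelope comparison:

```python
# Relative pixel count (a rough proxy for inference cost) versus a
# 320x320 detector input.
for size in (320, 256, 224):
    ratio = (size * size) / (320 * 320)
    print(f"{size}x{size}: {ratio:.0%} of the 320x320 pixel count")
# 256x256 is 64% of the pixels; 224x224 is 49%.
```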
-
Frigate14/Openvino/yolonas_s is working well for me so far. On my test hardware I get an inference time of 60ms with the 320 model on a modest Celeron 4095 iGPU. With the same setup, Frigate13/Openvino/yolov8 inference time was 55ms, so slightly better, while I was expecting yolonas to be a bit faster.
-
Describe the problem you are having
I've been using 0.13 with the YOLOv8n model with success. I'm trying to move across to 0.14 to try out the new UI, but I'm having trouble using the same model. The following config extract gets Frigate to start, but after a few seconds it exits, seemingly blaming detection:
detectors:
  ov:
    type: openvino
    device: AUTO

model:
  width: 416
  height: 416
  input_tensor: nchw
  input_pixel_format: bgr
  model_type: yolox
  path: /config/yolov8n.xml
  labelmap_path: /config/coco_80cl.txt
The following part of the log seems relevant:
2024-06-17 09:38:42.881536981 Process detector:ov:
2024-06-17 09:38:42.883308687 Traceback (most recent call last):
2024-06-17 09:38:42.883374317 File "/usr/lib/python3.9/multiprocessing/process.py", line 315, in _bootstrap
2024-06-17 09:38:42.883377573 self.run()
2024-06-17 09:38:42.883412652 File "/usr/lib/python3.9/multiprocessing/process.py", line 108, in run
2024-06-17 09:38:42.883415225 self._target(*self._args, **self._kwargs)
2024-06-17 09:38:42.883443932 File "/opt/frigate/frigate/object_detection.py", line 125, in run_detector
2024-06-17 09:38:42.883446712 detections = object_detector.detect_raw(input_frame)
2024-06-17 09:38:42.883486769 File "/opt/frigate/frigate/object_detection.py", line 75, in detect_raw
2024-06-17 09:38:42.883490184 return self.detect_api.detect_raw(tensor_input=tensor_input)
2024-06-17 09:38:42.883522448 File "/opt/frigate/frigate/detectors/plugins/openvino.py", line 155, in detect_raw
2024-06-17 09:38:42.883524995 infer_request.infer(input_tensor)
2024-06-17 09:38:42.883557073 File "/usr/local/lib/python3.9/dist-packages/openvino/runtime/ie_api.py", line 132, in infer
2024-06-17 09:38:42.883559339 return OVDict(super().infer(_data_dispatch(
2024-06-17 09:38:42.883604094 RuntimeError: Exception from src/inference/src/cpp/infer_request.cpp:116:
2024-06-17 09:38:42.883607221 Exception from src/inference/src/cpp/infer_request.cpp:66:
2024-06-17 09:38:42.883609966 Exception from src/plugins/intel_cpu/src/infer_request.cpp:367:
2024-06-17 09:38:42.883613283 ParameterMismatch: Failed to set tensor for input with precision: u8, since the model input tensor precision is: f32
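For what it's worth, the ParameterMismatch can be sketched outside Frigate: the OpenVINO plugin passes frames as u8 tensors, while a stock YOLOv8 IR export declares an f32 input. A minimal numpy-only illustration (shapes taken from the config above; this is a sketch of the mismatch, not Frigate's actual code path):

```python
import numpy as np

# Frigate's OpenVINO plugin hands the model a uint8 NCHW tensor.
u8_frame = np.random.randint(0, 256, (1, 3, 416, 416), dtype=np.uint8)

# A stock YOLOv8 IR declares its input as f32, so infer() rejects the
# u8 tensor with ParameterMismatch. Converting (and normalizing to 0-1,
# which YOLOv8 expects) would satisfy the declared precision:
f32_frame = u8_frame.astype(np.float32) / 255.0

print(f32_frame.dtype, f32_frame.shape)  # float32 (1, 3, 416, 416)
```

In practice that suggests the IR would need to be re-exported with u8 preprocessing baked in, or swapped for a model type that 0.14 still supports.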
Anyone have any thoughts?
Cheers
Version
0.14 Beta 3
Frigate config file
See body of message
docker-compose file or Docker CLI command
N/A
Relevant log output
Operating system
Other Linux
Install method
Docker Compose
Object Detector
OpenVino
Any other information that may be helpful
No response