Inference results on CPU and GPU are different #1858
nakayamarusu asked this question in Q&A (unanswered)
-
Hi, this seems quite unfortunate, especially as it seems to work on CPU. We'll need to ask someone who better understands OpenVINO GPU inference. @adrianboguszewski, do you maybe know how to proceed here?
-
Problem
I trained Padim on the MVTec bottle images. After exporting the model to a format usable by OpenVINO, I ran inference with OpenVINO. I get the correct heatmap when inferring on the CPU, but it doesn't work on the GPU. The GPU is an Intel Iris Xe, and I have confirmed that it is supported by OpenVINO.
Details
Train
Training was performed as follows.
anomalib fit -c configs/model/padim.yaml --data configs/folder_bottle.yaml
▼ padim.yaml
▼ folder_bottle.yaml
Export
The export was performed as follows.
anomalib export --model Padim --export_type OPENVINO --ckpt_path results/Padim/bottle/latest/weights/lightning/model.ckpt
Inference
Inference was performed with Python code written as follows.
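(The original script is not reproduced in this post. As a rough illustration only, inference on an exported anomalib OpenVINO model typically looks something like the sketch below; the model path, input size, and preprocessing helper are assumptions, not the actual code.)

```python
# Illustrative sketch only -- the model path, input size, and preprocessing
# below are assumptions, not the poster's actual script.
import numpy as np


def to_blob(image: np.ndarray) -> np.ndarray:
    """Convert an HxWx3 uint8 image into a 1x3xHxW float32 blob.

    Real anomalib exports also expect resizing and ImageNet normalization;
    this helper only shows the scaling and NHWC -> NCHW layout change.
    """
    img = image.astype(np.float32) / 255.0
    return img.transpose(2, 0, 1)[np.newaxis, ...]


def run_inference(model_xml: str, image: np.ndarray, device: str = "CPU"):
    """Load an exported OpenVINO IR and run a single inference on `device`."""
    import openvino as ov  # requires the `openvino` package

    core = ov.Core()
    compiled = core.compile_model(core.read_model(model_xml), device_name=device)
    result = compiled(to_blob(image))
    return result[compiled.output(0)]  # e.g. the anomaly map
```

Switching `device` between `"CPU"` and `"GPU"` would then be the only difference between the two runs described below.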
Inference result
Heatmap when the inference device is CPU:
▲ pred_score : 0.4836
▲ pred_score : 0.5605
Heatmap when the inference device is GPU:
▲ pred_score : 0.0
▲ pred_score : 0.0
Both CPU and GPU inference use the same model; the only change between the two runs is the device = "CPU" vs. device = "GPU" parameter. I also tried changing the OpenVINO version from 2024.0.0 to 2023.2.0, but GPU inference still does not work.
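A pred_score of exactly 0.0 on GPU suggests the raw output tensor itself differs, not just the post-processing. One way to narrow this down, sketched below, is to run the same input blob on both devices and compare the raw outputs numerically; forcing FP32 execution on the GPU plugin (which by default prefers FP16 math on Intel graphics) is also worth trying. The "INFERENCE_PRECISION_HINT" property is a real OpenVINO setting, but whether it resolves this particular model's mismatch is an assumption, not a confirmed fix.

```python
import numpy as np


def outputs_match(a, b, rtol=1e-3, atol=1e-3) -> bool:
    """Loose elementwise comparison, tolerant of FP16-vs-FP32 rounding."""
    return bool(np.allclose(np.asarray(a), np.asarray(b), rtol=rtol, atol=atol))


def compare_devices(model_xml: str, blob: np.ndarray) -> bool:
    """Run one blob on CPU and GPU and report whether the raw outputs agree."""
    import openvino as ov  # requires the `openvino` package

    core = ov.Core()
    model = core.read_model(model_xml)
    outs = {}
    for device in ("CPU", "GPU"):
        # Forcing f32 disables the GPU plugin's default f16 math,
        # a common source of CPU/GPU result mismatches.
        compiled = core.compile_model(
            model, device, {"INFERENCE_PRECISION_HINT": "f32"}
        )
        outs[device] = compiled(blob)[compiled.output(0)]
    return outputs_match(outs["CPU"], outs["GPU"])
```

If the raw tensors still disagree with f32 forced, the problem is more likely a plugin or driver issue than a precision one.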
Question
If I simply switch the inference device from CPU to GPU, I cannot get correct results. What can I do to get correct inference on the GPU?