Replies: 2 comments 1 reply
-
Please try with slice_size=640.
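For context, this is roughly the tile grid that a slice size of 640 with a fractional overlap produces. The helper below, `slice_windows`, is a hypothetical illustration (not SAHI's actual implementation) of how such overlapping windows are laid out over a large image:

```python
def slice_windows(img_w, img_h, slice_size=640, overlap=0.2):
    """Compute (x0, y0, x1, y1) tile windows covering an image.

    Tiles are slice_size square with a fractional overlap; the last
    row/column is shifted back so it stays inside the image border.
    """
    step = int(slice_size * (1 - overlap))

    def starts(dim):
        positions, pos = [], 0
        while True:
            # clamp so the tile never extends past the border
            positions.append(min(pos, max(dim - slice_size, 0)))
            if pos + slice_size >= dim:
                break
            pos += step
        # drop a duplicated final start if the clamp repeated it
        deduped = []
        for p in positions:
            if not deduped or p != deduped[-1]:
                deduped.append(p)
        return deduped

    return [(x, y, min(x + slice_size, img_w), min(y + slice_size, img_h))
            for y in starts(img_h) for x in starts(img_w)]
```

With these assumptions, a 5472×3648 image at slice_size=640 and 0.2 overlap yields 77 overlapping 640×640 tiles, so each object is seen at near-native resolution instead of being shrunk to the model's input size.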
-
Have you solved your problem? I have the same problem as you. The original size of the images in my dataset is 5472×3648, and the detected AP50 is only 15. When I reduce the image size to 2800×2800, the AP50 is more than 40. When I reduce the image size further, the accuracy improves further.
-
Is there some kind of limitation or special handling of detections in very large gigapixel images?
For a specific image I got no detections using get_sliced_prediction with YOLOv5. However, if I run the detection on the individual tiles with the same tile size used in get_sliced_prediction, I get a lot of good detections. So I tried dividing the large image in half, and then I began to get detections in the individual image parts, but some of the detections are false (they seem to be rather large random objects). Then I divided the image into four parts, and almost all detections were good. I tried all combinations of postprocess_type, postprocess_match_metric and perform_standard_pred, but I did not see any improvement on the very large image. I am using the model's native resolution as tile_size with 0.1 overlap. I also tried with no overlap, but there was no improvement.
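The per-tile workaround described above can be sketched as follows. `detect_tiled` and `detect_fn` are hypothetical names: `detect_fn` stands in for cropping the image and running the YOLOv5 model on the crop, and the key step is shifting each tile-local box back into full-image coordinates:

```python
def detect_tiled(image_w, image_h, tile_size, overlap, detect_fn):
    """Run detect_fn on each tile and map its boxes to global coordinates.

    detect_fn(x0, y0, x1, y1) is a stand-in for a real per-tile detector
    call; it returns boxes as (x0, y0, x1, y1, score) in tile-local
    coordinates.
    """
    step = int(tile_size * (1 - overlap))
    detections = []
    for ty in range(0, image_h, step):
        for tx in range(0, image_w, step):
            x1 = min(tx + tile_size, image_w)
            y1 = min(ty + tile_size, image_h)
            for bx0, by0, bx1, by1, score in detect_fn(tx, ty, x1, y1):
                # shift the tile-local box by the tile's origin
                detections.append((bx0 + tx, by0 + ty, bx1 + tx, by1 + ty, score))
    return detections
```

Note that with overlapping tiles the same object can be detected twice near tile borders, so the merged list still needs an NMS-style postprocess, which is what the postprocess options in get_sliced_prediction are meant to handle.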