
[BUG] Failed to populate metadata on 72 samples. Use dataset.exists("metadata", False) to retrieve them #46

Open
harry-esmart opened this issue May 15, 2023 · 7 comments
Labels
bug Something isn't working

Comments

@harry-esmart

Does anyone know why I hit this error while running evaluate_detections: Failed to populate metadata on 72 samples. Use dataset.exists("metadata", False) to retrieve them?

Computing metadata...
100% |█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 72/72 [791.6ms elapsed, 0s remaining, 91.0 samples/s]
Failed to populate metadata on 72 samples. Use dataset.exists("metadata", False) to retrieve them
{
'samples_count': 72,
'samples_bytes': 510162,
'samples_size': '498.2KB',
'media_bytes': 0,
'media_size': '0.0B',
'total_bytes': 510162,
'total_size': '498.2KB',
}
Failed to populate metadata on 72 samples. Use dataset.exists("metadata", False) to retrieve them
Threshold: 0.5
Evaluating detections...
100% |█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 72/72 [1.8s elapsed, 0s remaining, 40.7 samples/s]
              precision    recall  f1-score   support

     flashed       0.00      1.00      0.00         1
contaminated       0.00      1.00      0.00         1

   micro avg       0.00      1.00      0.00         2
   macro avg       0.00      1.00      0.00         2
weighted avg       0.00      1.00      0.00         2

@harry-esmart harry-esmart added the bug Something isn't working label May 15, 2023
@harry-esmart
Author

A follow-up question: this problem also seems to be the cause of the 0.00 precision values above.

@brimoor
Contributor

brimoor commented May 15, 2023

Have you inspected view = dataset.exists("metadata", False) to see what the problem is? For example, the image paths could be nonexistent.

This error is coming from dataset.compute_metadata(), which is internally used to populate things like image dimensions, which are necessary for certain computations. You could try directly running that a second time to see if those images are still unable to be processed.
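As a quick sanity check, one can verify outside of FiftyOne whether the paths reported by dataset.exists("metadata", False) actually exist on disk. A minimal sketch (the paths below are placeholders standing in for [s.filepath for s in dataset.exists("metadata", False)]):

```python
import os

# Placeholder filepaths; in practice, pull these from the failing view
filepaths = ["bad.jpg", "also_missing.png"]

# Any path that does not exist on disk cannot have its metadata computed
missing = [p for p in filepaths if not os.path.exists(p)]
print(f"{len(missing)} of {len(filepaths)} paths are missing on disk")
```

If every reported path is missing, the metadata failure is fully explained by the filesystem, not by FiftyOne.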

@harry-esmart
Author

Yeah, the image paths do not exist, but that should not affect the evaluation computations, right? Evaluation only needs the bounding boxes from the ground truth and predictions.

@brimoor
Contributor

brimoor commented May 16, 2023

The evaluation protocol behind whatever method you're calling needs information about absolute image dimensions for certain things. For example, labels themselves are stored in FO as relative values, so either the metadata field needs to be pre-populated with ImageMetadata instances whose height and width are set, or the images need to exist on disk so that dataset.compute_metadata() can populate this data on the fly.
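To illustrate why dimensions matter: FiftyOne stores detection boxes as [x, y, w, h] in relative [0, 1] coordinates, so converting them to pixels requires the image's width and height. A minimal pure-Python sketch (rel_to_abs is a hypothetical helper, not a FiftyOne API):

```python
def rel_to_abs(bbox, width, height):
    """Convert a relative [x, y, w, h] box in [0, 1] coordinates
    to absolute pixel coordinates."""
    x, y, w, h = bbox
    return [x * width, y * height, w * width, h * height]

# Without the image's width/height, this conversion is impossible
print(rel_to_abs([0.25, 0.5, 0.1, 0.2], 640, 480))  # [160.0, 240.0, 64.0, 96.0]
```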

@harry-esmart
Author

@brimoor So what you mean is that valid image paths are a prerequisite to calculating precision/recall/F1 correctly?

@brimoor
Contributor

brimoor commented May 26, 2023

Actually I'm confused. What code are you running exactly? This works perfectly fine with non-existent filepaths:

import fiftyone as fo
import fiftyone.zoo as foz

dataset = foz.load_zoo_dataset("quickstart")

# Point every sample at a nonexistent file
dataset.set_values("filepath", ["bad.jpg"] * len(dataset))

results = dataset.evaluate_detections(
    "predictions",
    gt_field="ground_truth",
    eval_key="eval",
    compute_mAP=True,
)

@harry-esmart
Author

harry-esmart commented Jun 1, 2023

Hi @brimoor,
It is true that the code can be run with non-existent filepaths. But while it runs, there is a warning/error saying Failed to populate metadata on 72 samples. Use dataset.exists("metadata", False) to retrieve them, and the calculated precision/recall/F1 metrics are incorrect. I am not sure whether the warning/error above or something else causes the problem.
