Extremely large size npz files during SuperPoint export_predictions #65

Open
Gpetrak opened this issue Jun 23, 2022 · 1 comment
Gpetrak commented Jun 23, 2022

Hi developers,

I'm trying to train HF-Net. During distillation, the exported npz files exceeded 100 GB within 20 minutes, and the process stopped because there was no space left on my device. My dataset is about 5 GB: roughly 50,000 images from the Berkeley DeepDrive dataset.
The command I'm using is the following:

python hfnet/hfnet/export_predictions.py hfnet/hfnet/configs/superpoint_export_distill.yaml superpoint_predictions --keys local_descriptor_map,dense_scores --as_dataset

Is this reasonable, or am I doing something wrong?
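For context, a quick back-of-envelope calculation suggests the size is expected when exporting *dense* outputs: with `--keys local_descriptor_map,dense_scores`, every image stores a full descriptor map plus a full score map as float32. The sketch below is a rough estimate only, assuming (hypothetically) 1280x720 BDD frames, a 256-dimensional descriptor map at 1/8 resolution, and a full-resolution score map; the actual shapes depend on the SuperPoint config used.

```python
# Back-of-envelope estimate of uncompressed per-image .npz size for dense
# SuperPoint exports. All shapes below are assumptions, not values taken
# from the hfnet config.
H, W = 720, 1280          # assumed BDD frame size
desc_dim, stride = 256, 8  # assumed descriptor depth and map stride
bytes_per_float = 4        # float32

# local_descriptor_map: (H/8) x (W/8) x 256 floats
desc_map_bytes = (H // stride) * (W // stride) * desc_dim * bytes_per_float
# dense_scores: H x W floats
scores_bytes = H * W * bytes_per_float

per_image_mib = (desc_map_bytes + scores_bytes) / 2**20
total_gib = per_image_mib * 50_000 / 1024  # 50k images

print(f"~{per_image_mib:.1f} MiB per image, ~{total_gib:.0f} GiB total")
# → ~17.6 MiB per image, ~858 GiB total
```

Under these assumptions the full dense export would need on the order of hundreds of GiB, so hitting 100 GB quickly is plausible rather than a bug. Compression (e.g. `np.savez_compressed`) could shave some of this off, but dense float maps typically compress poorly, so exporting sparse keypoints/descriptors instead of dense maps is the more effective way to shrink the output, if the downstream distillation step allows it.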

@Gpetrak Gpetrak changed the title Extremelly large size npz files during SuperPoint export_predictions Extremely large size npz files during SuperPoint export_predictions Jun 23, 2022
Amrmesi commented Nov 23, 2024

I have the same problem.
