
Paper and distillation loss #8

Open
ankardon opened this issue Dec 12, 2023 · 2 comments

Comments

@ankardon

Hello, I find your work highly interesting and would like to cite it. When can we expect the paper to be published?

Furthermore, I was curious about the distillation loss. The configuration for the model that achieved 27.1% mIoU on the hidden test set is ssc.yaml, correct? If so, is it right that no distillation was used to train that model, since MODEL.DISTILLATION is set to false in that YAML file?

@jdgalviss
Owner

Hello, thank you so much for looking at this repo. Unfortunately, I don't think it will be published, as it was just a project for one of my Master's courses. I can share the report with you, though.

Indeed, the best results on the SemanticKITTI benchmark test set were reached without the distillation loss. Counterintuitively, distillation gave a small improvement on the validation set, but this was not the case for the test set. Moreover, the distillation loss slows down training, since it adds an extra forward pass through the teacher model. There might also be something wrong with my implementation of the distillation loss, as I haven't tested it thoroughly.
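
For reference, here is a minimal sketch of how a distillation term of this kind is typically wired into a training step in PyTorch. The function names, the `alpha`/`temperature` values, and the way the `MODEL.DISTILLATION` flag is consumed are assumptions for illustration, not this repo's exact code:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-target KL divergence between teacher and student predictions."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

def training_step(student, teacher, batch, criterion, use_distillation, alpha=0.5):
    inputs, targets = batch
    student_logits = student(inputs)
    loss = criterion(student_logits, targets)  # standard supervised SSC loss
    if use_distillation:  # e.g. driven by MODEL.DISTILLATION in the config
        with torch.no_grad():  # extra teacher forward pass -> slower training
            teacher_logits = teacher(inputs)
        loss = loss + alpha * distillation_loss(student_logits, teacher_logits)
    return loss
```

With `use_distillation=False` (as in ssc.yaml, per the question above), the teacher is never run and the loss reduces to the plain supervised objective.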

@leungMr

leungMr commented Dec 12, 2024

@jdgalviss Thank you for sharing your code. It is highly valuable and might be helpful for my research. I was wondering if there is any corresponding report or document available that could help me better understand the implementation logic of the code. If possible, could you kindly share it with me via email at [email protected]? Your work is greatly appreciated, and I thank you in advance!
