
Using Distillation on a different dataset with a trained teacher #5

Open · Gharibw opened this issue Aug 25, 2020 · 2 comments

Gharibw commented Aug 25, 2020

Thanks for this amazing work. The current command-line example shows how to create a model using the KD process and a single dataset, i.e., CIFAR-10. However, I am trying to create a student model (using its own dataset) via distillation from a teacher model that was trained on a very similar but different dataset. Any guidance on how to accomplish this would be greatly appreciated.

fruffy (Collaborator) commented Aug 26, 2020

We mostly focused on CIFAR-10 and CIFAR-100, so it might be fairly tricky to extend our scripts to arbitrary datasets.
You can hook up your own dataset by replacing the get_cifar function call with your own loader, as sketched below.
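For illustration, a minimal sketch of what a drop-in replacement for that call might look like, assuming the rest of the pipeline consumes a standard PyTorch (train_loader, test_loader) pair; the function name and the random stand-in data are hypothetical, not part of this repo:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

def get_my_dataset(batch_size=128, num_workers=2):
    """Hypothetical drop-in for get_cifar: returns (train_loader, test_loader)."""
    # Stand-in tensors; in practice, wrap your own files in a
    # torch.utils.data.Dataset and apply the transforms your teacher expects.
    train_x = torch.randn(1000, 3, 32, 32)
    train_y = torch.randint(0, 10, (1000,))
    test_x = torch.randn(200, 3, 32, 32)
    test_y = torch.randint(0, 10, (200,))
    train_loader = DataLoader(TensorDataset(train_x, train_y),
                              batch_size=batch_size, shuffle=True,
                              num_workers=num_workers)
    test_loader = DataLoader(TensorDataset(test_x, test_y),
                             batch_size=batch_size, shuffle=False,
                             num_workers=num_workers)
    return train_loader, test_loader
```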

But there is no guarantee it will work, since you will also likely have to customize how the teacher and student are loaded (see the sketch after this paragraph).
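As a rough idea of the kind of customization meant here, a sketch of loading a frozen, pretrained teacher; the architecture and checkpoint path are placeholders, since the actual loading code depends on how your teacher was saved:

```python
import torch
from torchvision.models import resnet18

def load_teacher(checkpoint_path, num_classes):
    # Placeholder architecture; use whatever network your teacher was trained with.
    teacher = resnet18(num_classes=num_classes)
    teacher.load_state_dict(torch.load(checkpoint_path, map_location="cpu"))
    teacher.eval()  # the teacher only provides targets during distillation
    for p in teacher.parameters():
        p.requires_grad = False
    return teacher
```

Note that logit-matching distillation assumes the teacher and student share an output space; if your two datasets have different label sets, you will need to reconcile the classifier heads first.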

I think at this point it might be easier to take the KD trainer and use it with your own scripts; the core of a distillation training step is small enough to reimplement, as sketched below.
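A minimal sketch of the standard Hinton-style distillation step (temperature-scaled KL divergence blended with hard-label cross-entropy); the hyperparameters T and alpha are illustrative choices, not values taken from this repo:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Soft targets from the teacher plus cross-entropy on the hard labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients stay comparable across temperatures
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def train_step(student, teacher, batch, optimizer, device="cpu"):
    inputs, labels = (t.to(device) for t in batch)
    with torch.no_grad():  # the teacher is frozen; only the student learns
        teacher_logits = teacher(inputs)
    loss = kd_loss(student(inputs), teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```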

fruffy added the question (Further information is requested) label Aug 26, 2020
Gharibw (Author) commented Sep 10, 2020

Thanks for your response!
