
[QUESTION] Apply tutorial to fine_tune output #8

Closed
franferraz98 opened this issue Jun 21, 2023 · 4 comments

Comments

@franferraz98

Hi,
I'm pretty new to fastai and just got to know FasterAI. I was trying to apply the concepts from the Get Started tutorial to my own code. Currently I'm doing something like the following:

from fastai.vision import all as fst
from fastai.learner import Learner
from fastai.vision.learner import vision_learner
from fastai.tabular.core import df_shrink
# Code to get the datasets...

# Create dataloaders
data_loaders = data_block.dataloaders(tmp_dir, bs=config["batch_size"])  # config is a dictionary containing useful info
# Create model
base_model = getattr(fst, config["model"])
model = vision_learner(data_loaders, base_model, pretrained=True, lr=0.001,
                       metrics=fst.error_rate)

# Launch training
model.fine_tune(epochs)

And this works fine: I'm able to save and use the model with no issues. But then when I try to do Knowledge Distillation:

from fasterai.distill.all import *
from fastai.vision.all import *
# FasterAI tutorial replica
student = Learner(data_loaders, getattr(fst, config["model"]),
                  metrics=[accuracy])
kd_cb = KnowledgeDistillationCallback(model, SoftTarget)
student.fit_one_cycle(10, 1e-4, cbs=kd_cb)

I'm getting the following error:

File "/home/deeplearning/workspace/refactor/.venv/lib/python3.9/site-packages/fastai/torch_core.py", line 649, in trainable_params
    return [p for p in m.parameters() if p.requires_grad]
AttributeError: 'function' object has no attribute 'parameters'

Other important stuff:

  • Python version: 3.9
  • FastAI version: 2.7.11
  • FasterAI version: 0.1.13

Can you please help me?

@nathanhubens
Collaborator

Hi @franferraz98 !

When using the fastai class vision_learner, you are actually creating a fastai Learner. In this learner, you can access the underlying PyTorch model with learner.model.

The KnowledgeDistillationCallback requires the PyTorch model as a teacher and not the actual fastai learner.

It seems that you may have confused the two, as you are naming your learner "model" in the line model = vision_learner(data_loaders, base_model, pretrained=True, lr=0.001, metrics=fst.error_rate). In this case, you would need to pass model.model to the FasterAI callback. To avoid any confusion, I would suggest naming the first learner teacher and passing teacher.model to the KnowledgeDistillationCallback.
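The Learner-vs-model distinction can be illustrated with a minimal stand-in sketch (TinyModel and TinyLearner below are hypothetical toy classes, not fastai's actual implementation):

```python
class TinyModel:
    """Stand-in for a PyTorch nn.Module: exposes a parameters() method."""
    def parameters(self):
        return [0.1, 0.2]

class TinyLearner:
    """Stand-in for a fastai Learner: wraps the model in a .model attribute."""
    def __init__(self, model):
        self.model = model

teacher = TinyLearner(TinyModel())

# The learner itself has no parameters() method...
assert not hasattr(teacher, "parameters")
# ...but the wrapped model does, so a callback that iterates over
# parameters should receive teacher.model, not teacher:
assert hasattr(teacher.model, "parameters")
```

This mirrors the AttributeError above: fastai's trainable_params calls m.parameters(), which only exists on the wrapped PyTorch model.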

I hope this clarifies the issue, and also hope that it will actually solve it. 😊

Do not hesitate if you have other questions,

Nathan

@franferraz98
Author

franferraz98 commented Jun 22, 2023

Hi @nathanhubens,

Thanks for the quick response! I'm getting the same error when using model.model. If it's of importance, I'm using a ResNet18 model (config["model"] == "resnet18"). I've checked through the debugger and, as expected, there is no 'parameters' attribute on model.model.

Edit: Actually, is it a protected attribute?
[debugger screenshot]

Thank you!

Fran.

@nathanhubens
Collaborator

Hi again @franferraz98,

I think I know what is going on: in fastai, vision_learner expects a model function to create the learner (e.g. vision_learner(dls, resnet18)), while Learner expects an instantiated model (e.g. Learner(dls, resnet18())). Notice the subtle difference between resnet18 and resnet18().

You should thus find a way to replace resnet18 with resnet18() when using Learner.

For example, this can be achieved by doing m = globals()[config["model"]] and then student = Learner(dls, m(num_classes, ...))
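As a rough sketch of that lookup-then-instantiate pattern (make_resnet18 below is a hypothetical stand-in for fastai's real resnet18 constructor):

```python
def make_resnet18(num_classes=2):
    """Hypothetical stand-in for a model constructor like fastai's resnet18."""
    return {"arch": "resnet18", "num_classes": num_classes}

config = {"model": "make_resnet18"}

# vision_learner-style: look up the *function* itself by name
arch_fn = globals()[config["model"]]

# Learner-style: call the function to get an instantiated model
student_model = arch_fn(num_classes=10)
assert student_model["num_classes"] == 10
```

The key point is that Learner should receive the result of calling arch_fn, while vision_learner receives arch_fn itself.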

Hope that solves it!

@franferraz98
Author

Hi @nathanhubens,
You are exactly right, it's working now. Thanks a lot! I'll reach back out to you if any other issues come up.
I'll close the ticket.
