Can I add a composition Block of different adapter types? #688
-
Can I stack prefix tuning with a bottleneck adapter, something like ConfigUnion but as a composition block? Here is my scenario: suppose I have a trained prefix tuning or LoRA adapter that I want to load into my model and activate, and I want to use the output of this adapter as the input to a stacked new bottleneck adapter that will be trained on a different task. So both the loaded LoRA adapter and the new adapter are activated, but only the new one is trained. Can I do that? Here is my code:
[{'name': 'lora-mlm-target',
But when I run evaluate or train it gives me this error:
What is the problem? Is it not supported?
-
Hi @MRawhani
Are you using the current library version? AFAIK we fixed something related to Stack some time ago. I couldn't replicate your problems with this:

import adapters

model = adapters.AutoAdapterModel.from_pretrained("google-bert/bert-base-uncased")
model.add_adapter("a", config="seq_bn")
model.add_adapter("b", config="lora")
model.add_classification_head("b", num_labels=3)
model.active_adapters = adapters.composition.Stack("a", "b")

This throws no error, and the adapter summary looks like this:
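As a follow-up to the repro above, here is a rough sketch of the full scenario from the question: a previously trained LoRA adapter kept frozen, with a new bottleneck adapter stacked on top and trained. The checkpoint path and adapter names are placeholders, and whether setting active_adapters after train_adapter keeps only the new adapter trainable should be verified against the current adapters docs:

import adapters
from adapters.composition import Stack

model = adapters.AutoAdapterModel.from_pretrained("google-bert/bert-base-uncased")

# Load the already trained LoRA adapter (placeholder path) under a local name.
model.load_adapter("path/to/trained_lora_adapter", load_as="lora_mlm")

# Add a fresh bottleneck adapter plus a head for the new task.
model.add_adapter("task_bn", config="seq_bn")
model.add_classification_head("task_bn", num_labels=3)

# Freeze the base model and mark only the new adapter as trainable.
model.train_adapter("task_bn")

# Activate both adapters so the LoRA output feeds into the bottleneck adapter;
# the loaded LoRA weights should stay frozen (assumption, please verify).
model.active_adapters = Stack("lora_mlm", "task_bn")

Training can then proceed with the usual Trainer / AdapterTrainer setup; only the parameters of "task_bn" and its head should receive gradients.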