Adapter Override Warning During Trainer.Predict() #671
Asked by JosephGatto in Q&A
I notice that whenever I call `trainer.predict()` after training an adapter model (I am using BART for seq2seq), I get the following warning:

Is this safe to ignore? All I am doing is running `Seq2SeqAdapterTrainer.train()` and then immediately calling `trainer.predict()`.
Answered by calpt on Apr 14, 2024
Depending on your exact training setup, this can be expected. For example, if you use `load_best_model_at_end=True` to re-load the best checkpoint after training has completed, the existing adapter will be overwritten.
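As a concrete illustration, here is a minimal sketch of the kind of training configuration that triggers this behavior. The `output_dir` value and the evaluation/save strategies are placeholders, not taken from the original question; the point is only the `load_best_model_at_end` flag:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of a setup where the override warning is expected:
# with load_best_model_at_end=True, the Trainer re-loads the best
# saved checkpoint once training finishes, which replaces the adapter
# weights currently attached to the model. The warning on the next
# predict() call then just reports that replacement.
args = Seq2SeqTrainingArguments(
    output_dir="./out",                  # placeholder path
    evaluation_strategy="epoch",         # checkpoints must be evaluated...
    save_strategy="epoch",               # ...and saved, so a "best" one exists
    load_best_model_at_end=True,         # source of the expected override
    metric_for_best_model="eval_loss",
)
```

If the warning appears in this setup, it is expected and safe to ignore: the adapter being overwritten is simply the end-of-training state being replaced by the best checkpoint.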