
sft learn to generate eos token #494

Open · wants to merge 1 commit into main
Conversation

digitalSquirrel1

Prevent the 'eos_token' from being labeled as -100 and thus excluded from the loss. One-line change in sft.py.
Related issue: #492
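The patch itself is not reproduced in this thread. As a hedged sketch of the kind of one-line change described, assuming the script previously reused the EOS token as the pad token (an assumption, not the confirmed diff):

```python
# Hypothetical sketch of the fix in sft.py -- not the confirmed diff.
# Before: reusing EOS as PAD means the collator later masks every EOS position:
#   tokenizer.pad_token = tokenizer.eos_token
# After: pad with a distinct special token so EOS survives in the labels:
tokenizer.pad_token = "<|fim_pad|>"  # assumes this token exists in the vocab
```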

sft learn to generate eos token
@digitalSquirrel1 (Author)

transformers.data.data_collator.DataCollatorForLanguageModeling:

[screenshot of the relevant DataCollatorForLanguageModeling source]
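For readers without the screenshot: with `mlm=False`, this collator clones the input IDs as labels and sets every position equal to `pad_token_id` to -100. A minimal sketch of the failure mode when PAD and EOS share an ID (the checkpoint name is just an illustration):

```python
from transformers import AutoTokenizer, DataCollatorForLanguageModeling

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-Math-7B")  # PAD == EOS here
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

features = [tokenizer("Hello world" + tokenizer.eos_token)]
batch = collator(features)

# The trailing EOS shares its ID with PAD, so the collator rewrites its
# label to -100 and it never contributes to the loss:
print(batch["labels"][0, -1])  # tensor(-100)
```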

@qgallouedec (Member)

Thanks!

javirandor commented Mar 11, 2025

@digitalSquirrel1 This works only when starting SFT from Qwen/Qwen2.5-Math-7B-Instruct. Other models have the same EOS and PAD token, e.g. Qwen/Qwen2.5-Math-7B (see here), so this will not work.
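A quick way to check which checkpoints are affected (assuming Hub access; the token values are per the linked tokenizer configs):

```python
from transformers import AutoTokenizer

for name in ("Qwen/Qwen2.5-Math-7B", "Qwen/Qwen2.5-Math-7B-Instruct"):
    tok = AutoTokenizer.from_pretrained(name)
    print(name, "eos:", tok.eos_token, "pad:", tok.pad_token,
          "collides:", tok.pad_token_id == tok.eos_token_id)
```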

Tim-Siu commented Mar 25, 2025

> @digitalSquirrel1 This works only when starting SFT from Qwen/Qwen2.5-Math-7B-Instruct. Other models have the same EOS and PAD token, e.g. Qwen/Qwen2.5-Math-7B (see here), so this will not work.

I think we would normally replace the tokenizer_config of base models with their instruct counterparts?

lewtun (Member) commented Mar 25, 2025

Thanks for catching this issue with the data collator and pad tokens @digitalSquirrel1! I think a more robust solution would be to expose pad_token in SFTConfig on the TRL side, as this would allow users to configure it per model. For example, for Qwen models, one could then use a pre-existing but otherwise irrelevant token like "<|fim_pad|>" as the pad token.

Since we may also wish to specify other special tokens like EOS, the ideal data structure would be to expose a special_tokens dictionary in SFTConfig so we can use it in the configs like this:

special_tokens:
  pad_token: "<|fim_pad|>"
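
Until such a field exists, a minimal sketch of the same idea applied manually in a training script (token choice per the Qwen example above; assumes `<|fim_pad|>` is in the vocabulary):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-Math-7B")
# Repurpose an existing-but-unused special token as PAD so it no longer
# collides with EOS, and the collator stops masking EOS in the labels.
tokenizer.pad_token = "<|fim_pad|>"
assert tokenizer.pad_token_id != tokenizer.eos_token_id
```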

Would you like to open a PR on TRL for this? cc @qgallouedec for viz
