
Update text_generation.py #938

Closed · wants to merge 1 commit

Conversation

imrankh46
imrankh46 commented

SUMMARY:
Replace `tokenizer` with `processing_class` in `Trainer` to avoid this warning:

packages/llmcompressor/transformers/finetune/session_mixin.py:95: FutureWarning: `tokenizer` is deprecated and will be removed in version 5.0.0 for `Trainer.__init__`. Use `processing_class` instead.
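The change amounts to renaming a keyword argument at the `Trainer` call site. A minimal stand-alone sketch of the deprecation shim the warning points at (the `init_trainer` function below is hypothetical, not the actual transformers implementation; only the warning text comes from the log above):

```python
import warnings

def init_trainer(processing_class=None, tokenizer=None):
    """Hypothetical stand-in for the Trainer.__init__ deprecation shim."""
    if tokenizer is not None:
        warnings.warn(
            "`tokenizer` is deprecated and will be removed in version 5.0.0 "
            "for `Trainer.__init__`. Use `processing_class` instead.",
            FutureWarning,
        )
        # Fall back to the deprecated argument if the new one was not given
        if processing_class is None:
            processing_class = tokenizer
    return processing_class
```

Passing `processing_class=...` instead of `tokenizer=...` takes the quiet path, which is exactly what this PR does for the `Trainer.__init__` call.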
@kylesayrs (Collaborator) commented Nov 26, 2024

Thank you very much for the contribution @imrankh46 !
This should be followed up with a more complete genericization of tokenizer -> processor, but this is a good start to get rid of the pesky warning.

  1. Could you please rebase and sign your commits to pass the DCO
  2. Please change src/llmcompressor/transformers/finetune/session_mixin.py:489 to:
# Prefer the new `processing_class` attribute; fall back to the
# deprecated `tokenizer` attribute on older transformers versions
processor = self.processing_class if hasattr(self, "processing_class") else self.tokenizer
if processor is not None:
    processor.save_pretrained(output_dir)

Thanks again!

@kylesayrs kylesayrs added the `ready` (When a PR is ready for review) label Nov 26, 2024
@kylesayrs kylesayrs self-assigned this Nov 26, 2024
@kylesayrs kylesayrs mentioned this pull request Dec 2, 2024
@kylesayrs
Copy link
Collaborator

Handled by #955

@kylesayrs kylesayrs closed this Dec 10, 2024