
Input size #203

Open
maryamag85 opened this issue Feb 27, 2025 · 3 comments

Comments

@maryamag85

What is the maximum input size the distilled model can handle? If the original model has no limit on input size, is that also preserved by distillation?

@stephantul
Member

Hello @maryamag85,

I'm assuming you are talking about sequence length. The sequence length of our models is unbounded. The default, however, is 512; it can be controlled by setting max_length when encoding. Set it to None to always encode the full sequence.

Stéphan

@shreyaskarnik

When using StaticModelForClassification how do I set max_length ?

@stephantul
Member

Hello @shreyaskarnik,

It is a parameter of the predict function.
