
Nvidia Tesla P40 24GB #1374

Closed · LakoMoor opened this issue Oct 16, 2023 · 3 comments
@LakoMoor

Hello! Has anyone used a P40 GPU? I'm interested in how many tokens per second it generates, preferably on 7B models.

@WoosukKwon (Collaborator)

Hi @LakoMoor, unfortunately vLLM only supports Volta or later GPUs, so the P40 is not officially supported.
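For context on why the P40 is excluded: vLLM's kernels require CUDA compute capability 7.0 (Volta) or newer, while the Tesla P40 is a Pascal card with compute capability 6.1. A minimal sketch of such a capability check (the `is_supported` helper is hypothetical, not vLLM's actual API; the commented PyTorch query is one real way to read the local GPU's capability):

```python
# Sketch: check whether a GPU meets a Volta-or-newer (compute capability 7.0)
# requirement. The Tesla P40 is Pascal, compute capability 6.1, so it fails.

def is_supported(capability):
    """Return True if a (major, minor) compute capability is Volta (7.0) or later."""
    # Tuple comparison: (6, 1) < (7, 0) < (8, 0)
    return capability >= (7, 0)

print(is_supported((6, 1)))  # Tesla P40 (Pascal)
print(is_supported((7, 0)))  # Tesla V100 (Volta)

# With PyTorch installed, the local GPU's capability can be queried like so:
# import torch
# is_supported(torch.cuda.get_device_capability(0))
```

Tuple comparison makes the minor version matter only when major versions tie, which matches how compute capabilities are ordered.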

@LakoMoor (Author)

Oh... that's very sad... I just bought 10 GPUs...((

@yukimakura

#963
