Video2Text Inference is slow and high vram consumption #116
Comments
Could you share the specs of your machine?

I would recommend:
Hi Prince, my testing machine is an M3 Max with 128GB RAM. Thanks.
Ok, thanks.
Awesome! It should work fine if you just lower the resolution. I have an M3 Max with 96GB unified RAM and can run this example in under a minute:
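The advice above amounts to downscaling each frame before it reaches the model, which cuts per-frame memory roughly by the square of the scale factor. A minimal sketch of that idea, assuming the pipeline accepts frames as NumPy `HxWxC` arrays (the function name and the average-pooling approach are illustrative, not part of this project's API):

```python
import numpy as np

def downscale_frame(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Shrink a frame by average-pooling factor x factor pixel blocks.

    frame: HxWxC uint8 array; H and W are cropped to multiples of factor.
    A factor of 2 quarters the memory each frame occupies.
    """
    h, w, c = frame.shape
    h, w = h - h % factor, w - w % factor
    blocks = frame[:h, :w].reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3)).astype(np.uint8)

# Example: a 1080p RGB frame becomes 540p before inference.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
small = downscale_frame(frame, factor=2)
print(small.shape)  # (540, 960, 3)
```

For a 90-second clip, lowering the frame rate sampled from the video (e.g. keeping every other frame) compounds with the resolution reduction, since total memory scales with frames kept times pixels per frame.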
My pleasure!
Hi,
I want to process a 90-second video, but I run out of memory. Is there any way to reduce the VRAM consumption?
Thanks.