
Model Inference Speed #154

Open
RAJA-PARIKSHAT opened this issue Dec 9, 2021 · 3 comments

Comments

@RAJA-PARIKSHAT

I am running an image-to-video face swap and it takes at least five minutes to process a single pair, even though I have enabled the GPU in Colab. Can someone share their approximate inference time? @YuvalNirkin @SajjadAemmi

@SajjadAemmi
Contributor

Hi @RAJA-PARIKSHAT
How long is your video?

@RAJA-PARIKSHAT
Author

I take your point about video length. I am now processing a 12-second video, but it still takes around 2 minutes on Google Colab. @SajjadAemmi Do you have any suggestions if we want to run in real time?
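(For context, a back-of-the-envelope check on the numbers in this comment, assuming the 12-second clip is 25 fps and the "around 2 minutes" is pure processing time; both are assumptions, not figures from the repo:)

```python
# Rough throughput estimate from the timings reported in this thread.
# Assumptions (not from the project): 25 fps source clip, 120 s of
# processing time with no per-video startup overhead.
clip_seconds = 12
fps = 25                      # assumed frame rate of the clip
processing_seconds = 120      # "around 2 minutes" on Colab

frames = clip_seconds * fps                 # total frames in the clip
throughput = frames / processing_seconds    # frames processed per second
slowdown = fps / throughput                 # factor away from real time

print(f"{frames} frames, {throughput:.1f} fps throughput, "
      f"{slowdown:.0f}x slower than real time")
```

Under those assumptions the pipeline runs at roughly 2.5 fps, about an order of magnitude short of real time, which matches the maintainer's reply below.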

@SajjadAemmi
Contributor

@RAJA-PARIKSHAT this project is not real-time at all.
