
Inference Time #22

Open
AnasAlsharo opened this issue Oct 29, 2024 · 1 comment
@AnasAlsharo
Hi.

I am running inference via app.py. The first run takes about 30 seconds for the image I provide, but subsequent runs on the same image take longer, sometimes up to 4 minutes. I have tried clearing the GPU memory, yet the run time stays elevated.

@haodong2000
Collaborator

Hi @AnasAlsharo, thanks for your interest!

The inference time comparison in Fig. 3 (https://arxiv.org/html/2409.18124v4/x4.png) is conducted without any I/O, including model loading.

On the first run, the app downloads all the diffusion checkpoints, which may take some time.

Subsequent runs should then be faster than the first.
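To reproduce the Fig. 3 methodology, one would time only the forward pass, after model loading and checkpoint downloads have completed. A minimal sketch of that timing pattern is below; `run_inference` is a hypothetical stand-in, not this repository's actual API, and on a GPU one would also call `torch.cuda.synchronize()` before reading the timer so queued kernels are counted.

```python
import time

def run_inference(image):
    # Hypothetical stand-in for the model forward pass invoked by app.py.
    # In the real app, the pipeline and checkpoints are loaded before this
    # point, so their cost is excluded from the measurement.
    return image

image = "example.png"  # placeholder input
timings = []
for _ in range(3):
    start = time.perf_counter()
    run_inference(image)
    # With a CUDA model, call torch.cuda.synchronize() here before
    # stopping the timer, since GPU kernels launch asynchronously.
    timings.append(time.perf_counter() - start)

print(f"per-run inference time (s): {[round(t, 6) for t in timings]}")
```

If later runs are still slower than the first under this measurement, the slowdown is in the model itself (e.g. GPU memory pressure) rather than in checkpoint downloads.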
