
Inference speed #59

Open
pengzhi1998 opened this issue Dec 10, 2021 · 0 comments

Comments


pengzhi1998 commented Dec 10, 2021

I'm extremely sorry to ask you so many questions.

However, when testing the monodepth network with DPT-large, I found that inference is not fast: in the for loop, each image's depth estimation takes around 0.6 seconds. The weird thing is that, for each image, the time is dominated either by forward() (around 0.5s) or by cpu() (around 0.3s), and when one step is slow, the other is relatively fast. This makes the average time per step around 0.4 seconds.

Do you have any idea why these steps are so time-consuming, and what causes the strange phenomenon that the two steps are never both fast at the same time?
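
For context, here is a minimal timing sketch of the pattern described above (the convolutional stack and tensor shapes are hypothetical stand-ins for DPT-large and its input, and a CUDA GPU is assumed). PyTorch launches CUDA kernels asynchronously, so if forward() and cpu() are each wrapped with time.time() but no torch.cuda.synchronize() is called, the GPU work started by forward() can end up attributed to the first call that blocks on the result, such as cpu(), which would make the two measurements trade off exactly as observed:

```python
import time

import torch
import torch.nn as nn

# Hypothetical stand-in for DPT-large; any sufficiently heavy module
# reproduces the effect. A CUDA GPU is assumed.
model = nn.Sequential(
    *[nn.Conv2d(64, 64, 3, padding=1) for _ in range(32)]
).cuda().eval()
x = torch.randn(1, 64, 384, 384, device="cuda")

with torch.no_grad():
    for _ in range(5):
        # Naive timing: CUDA kernels run asynchronously, so forward() may
        # return before the GPU is done; the leftover work is then billed
        # to the first blocking call, here .cpu().
        t0 = time.time()
        pred = model(x)
        t1 = time.time()
        out = pred.cpu()
        t2 = time.time()
        print(f"naive : forward {t1 - t0:.3f}s, cpu() {t2 - t1:.3f}s")

        # Synchronized timing: wait for the GPU before reading the clock,
        # so each stage is charged its true cost.
        torch.cuda.synchronize()
        t0 = time.time()
        pred = model(x)
        torch.cuda.synchronize()
        t1 = time.time()
        out = pred.cpu()
        t2 = time.time()
        print(f"synced: forward {t1 - t0:.3f}s, cpu() {t2 - t1:.3f}s")
```

With the synchronized variant, each stage reports a stable cost, and their sum reflects the true per-image latency.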
