This repository has been archived by the owner on Dec 18, 2024. It is now read-only.
However, when testing the monodepth network with DPT-Large, I found that inference is not fast: in the for loop, each image's depth estimation takes around 0.6 seconds. The weird thing is that, for each image, the time-consuming step is either the forward pass (around 0.5 s) or the `.cpu()` call (around 0.3 s), and when one step is slow the other is relatively fast, so the average comes out to around 0.4 seconds.
Do you have any idea why these steps are so time-consuming, and what causes the strange behavior where the two steps are never both fast?
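One likely explanation (an assumption, not confirmed in this thread) is that CUDA kernels are launched asynchronously: the forward call returns as soon as the kernels are queued, and the program only blocks at the first operation that needs the result, such as `.cpu()`. The real compute cost is paid exactly once per image, but it surfaces in whichever call happens to synchronize first, so the two timings trade off against each other. A toy, GPU-free sketch of this effect using only the standard library (`run_async` is a hypothetical stand-in for the model call, not part of any real API):

```python
import threading
import time

def run_async(duration):
    """Queue `duration` seconds of simulated 'GPU' work on a background
    thread and return a handle immediately, like an async kernel launch."""
    done = threading.Event()
    worker = threading.Thread(target=lambda: (time.sleep(duration), done.set()))
    worker.start()
    return done

t0 = time.perf_counter()
handle = run_async(0.3)   # returns almost instantly, like model(x) on CUDA
t1 = time.perf_counter()
handle.wait()             # blocks until the work finishes, like result.cpu()
t2 = time.perf_counter()

launch_time = t1 - t0     # tiny: the work was only queued, not executed
fetch_time = t2 - t1      # ~0.3 s: the wait surfaces at the sync point
```

Timing the two steps separately therefore misattributes the cost. With real PyTorch code, calling `torch.cuda.synchronize()` immediately before each timing point forces the queued kernels to finish first, so each measured interval reflects the work that actually belongs to it.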
I'm extremely sorry to ask you so many questions.