
Production inference quality extremely low #219

Open
robrohan opened this issue Dec 3, 2024 · 0 comments

Comments


robrohan commented Dec 3, 2024

Hello - I have to say this project is amazing. You've done some incredible work.

I've been playing around with the model and had some great results running it directly in Python in "training mode" (i.e. just loading the checkpoints and doing inference). However, when I export the model for production serving (script mode doesn't work because of the type annotations, so I use trace() instead) and run it from an inference server, the results are not even close to what it produces directly from Python.
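For reference, this is the kind of sanity check I've been doing to compare the eager model against its traced version before it ever reaches the serving stack (with a toy module standing in for the real model, and 518 matching the input size the ONNX builds use):

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the real depth model; the same check applies
# to any module you plan to trace for production serving.
class TinyDepthHead(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 1, kernel_size=3, padding=1)

    def forward(self, x):
        return torch.sigmoid(self.conv(x))

model = TinyDepthHead().eval()

# Trace with a fixed-size example input.
example = torch.randn(1, 3, 518, 518)
traced = torch.jit.trace(model, example)

# Compare eager vs. traced outputs on a fresh input: a large difference
# here points at the export itself, while a match suggests the problem is
# in the serving-side preprocessing (resizing, normalization, layout).
with torch.no_grad():
    x = torch.randn(1, 3, 518, 518)
    eager_out = model(x)
    traced_out = traced(x)
    print(torch.allclose(eager_out, traced_out, atol=1e-5))
```

In my toy tests like this the traced outputs match eager mode, which is what makes the garbage I see from the production server so confusing.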

I've tried other people's ONNX builds as well, with the same results: max image size 518, and the depth map is unusable. It looks like a splatter of black and white, almost random.

Is there a tutorial or somewhere I can look for help on running this model in a production environment? Or is it a known issue that you have to run the model "unfrozen" directly in Python from the checkpoints?

Thank you for your help, and congratulations on such a great piece of work.
