[🐛 bug report] Gradient value is empty with free_graph=False
#485
Comments
Hi @pjessesco, what do you mean by "can't optimize"? Are the resulting gradients wrong?
Hi @Speierers, sorry for the ambiguous wording. I printed the gradient, and its value is empty when `free_graph=False`.
That is likely a bug in the enoki AD backend. Unfortunately, we are working hard on the upcoming release of the library, so I probably won't have the time to look into this in the near future. In any case, could you send me the whole Python snippet to reproduce this? I could take a brief look to see if there is anything obvious.
Here is the script; it's almost the same as the bunny example, except for the option: invert_bunny.py
This looks reasonable. It is likely a bug in the enoki AD backend then.
Summary
Hi, I'm trying to do differentiable rendering without the `free_graph` option. It looks like when the option is enabled, each visited node is erased after it is used to compute the gradient while traversing the graph. My goal is to show each node's gradient in its label in the graph (using `ek.graphviz`), so I don't want the graph to be reduced during backpropagation. However, I couldn't optimize the parameter with `free_graph=False`... thanks for your reply.
System configuration
master b92ddc2
Ubuntu
CUDA 11.2
Steps to reproduce
In the `invert_bunny.py` example, change the option as below.
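The semantics being discussed can be sketched with a toy reverse-mode autodiff, a hypothetical minimal model that is not enoki's actual implementation: with `free_graph=True`, each node's edges are cleared as the backward traversal visits it (so nothing is left afterwards to label in a graph dump), while with `free_graph=False` the graph survives and gradients should still come out correctly.

```python
# Toy reverse-mode AD illustrating free_graph semantics.
# Hypothetical minimal model for illustration only; names and
# structure are assumptions, not enoki's real API.

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        # parents: list of (parent_var, local_derivative)
        self.parents = list(parents)

def add(a, b):
    return Var(a.value + b.value, [(a, 1.0), (b, 1.0)])

def mul(a, b):
    return Var(a.value * b.value, [(a, b.value), (b, a.value)])

def backward(out, free_graph=True):
    out.grad = 1.0
    # Topological order via depth-first traversal.
    order, seen = [], set()
    def visit(v):
        if id(v) in seen:
            return
        seen.add(id(v))
        for p, _ in v.parents:
            visit(p)
        order.append(v)
    visit(out)
    # Propagate gradients from the output back to the leaves.
    for v in reversed(order):
        for p, w in v.parents:
            p.grad += v.grad * w
        if free_graph:
            v.parents.clear()  # edges freed during traversal

x = Var(3.0)
y = add(mul(x, x), x)        # y = x^2 + x, so dy/dx = 2x + 1
backward(y, free_graph=False)
print(x.grad)                # 7.0
print(len(y.parents))        # 2: graph retained, still inspectable
```

Under these semantics, `free_graph=False` only trades memory for the ability to inspect the graph afterwards; the gradients themselves should be identical in both modes, which is why an empty gradient here points at a backend bug.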