I'm trying to reproduce and play with the differentiable rendering example with the Cornell box described here: https://mitsuba2.readthedocs.io/en/latest/src/inverse_rendering/diff_render.html

All the steps described there work, but if I try to differentiate/optimize, e.g., `OBJMesh_1.vertex_positions` instead of `red.reflectance.value`, I get the following error when running the optimization:
```
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-17-437578f9b545> in <module>
     10
     11 # Back-propagate errors to input parameters
---> 12 ek.backward(ob_val)
     13
     14 # Optimizer: take a gradient step

RuntimeError: set_gradient(): no gradients are associated with this variable (a prior call to requires_gradient() is required.)
```
I tried every object in the scene, but `vertex_positions` cannot be optimized/differentiated for any of them.

I find the error strange because the docs say that the `requires_gradient()` call is performed automatically when initializing the Adam optimizer. Calling it manually beforehand doesn't help either.

So are there additional steps I should take to be able to optimize e.g. `vertex_positions`?
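For context, the loop in question follows the tutorial's pattern, with only the kept parameter swapped out. A minimal sketch, assuming the `gpu_autodiff_rgb` variant, the tutorial's Cornell box scene file, a reference image `image_ref` already rendered, and that `OBJMesh_1.vertex_positions` is a key actually listed by `print(params)` for the scene:

```python
import enoki as ek
import mitsuba

# Differentiable rendering requires an autodiff-enabled variant
mitsuba.set_variant('gpu_autodiff_rgb')

from mitsuba.core.xml import load_file
from mitsuba.python.util import traverse
from mitsuba.python.autodiff import render, Adam

# Scene path and parameter key are assumptions; check print(params)
# for the exact keys in your scene.
scene = load_file('cornell_box/scene.xml')
params = traverse(scene)
params.keep(['OBJMesh_1.vertex_positions'])

opt = Adam(params, lr=0.02)

for it in range(100):
    # Differentiable rendering of the scene
    image = render(scene, optimizer=opt, unbiased=True, spp=1)

    # Objective: mean squared error against the reference image
    ob_val = ek.hsum(ek.sqr(image - image_ref)) / len(image)

    # Back-propagate errors to input parameters -- this is the call
    # that raises "set_gradient(): no gradients are associated ..."
    ek.backward(ob_val)

    # Optimizer: take a gradient step
    opt.step()
```

This is the same structure as the `red.reflectance.value` example in the tutorial; only the `params.keep([...])` line differs.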
Thanks for helping! Pidgey