Differentiating the dual variable of a linear program #261

Closed
innuo opened this issue Dec 18, 2024 · 5 comments

innuo commented Dec 18, 2024

Say I have a function that solves a linear program whose parameters are built from the arguments of the function. The function returns the sum of the norm of the optimal primal solution and the norm of the optimal dual vector of the constraints. Can I use DiffOpt to find the gradient of the return value with respect to the arguments?

I looked through the docs and the examples but it wasn't clear to me how to do this or if it is possible. Any pointers would be greatly appreciated.
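
For concreteness, here is a minimal sketch of the kind of function I mean (the solver HiGHS and the toy problem data are just placeholders):

```julia
using JuMP, HiGHS, LinearAlgebra

# f(θ) solves an LP whose data depends on θ and returns ‖x*‖ + ‖λ*‖,
# the norms of the optimal primal and dual vectors.
function f(θ::Vector{Float64})
    n = length(θ)
    model = Model(HiGHS.Optimizer)
    set_silent(model)
    @variable(model, x[1:n] >= 0)
    @constraint(model, con[i in 1:n], x[i] <= θ[i])  # parameters enter the constraint data
    @objective(model, Max, sum(x))
    optimize!(model)
    return norm(value.(x)) + norm(dual.(con))
end

f([1.0, 2.0, 3.0])  # the question: how to get ∇f(θ)?
```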

innuo (Author) commented Jan 8, 2025

Thoughts on this issue? It would be nice to be able to differentiate both the primal and dual solutions with respect to the parameters of the problem.

I'll be grateful for any help.

blegat (Member) commented Jan 8, 2025

You can use DiffOpt to differentiate the primal, then compute the dual with Dualization and use DiffOpt on that dual model.
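
Something along these lines (untested sketch; HiGHS, the toy problem data, and the reverse-mode seed are placeholders):

```julia
using JuMP, DiffOpt, Dualization, HiGHS

θ = [1.0, 2.0]

# Primal LP wrapped in DiffOpt so its primal solution can be differentiated.
primal = Model(() -> DiffOpt.diff_optimizer(HiGHS.Optimizer))
set_silent(primal)
@variable(primal, x[1:2] >= 0)
@constraint(primal, con[i in 1:2], x[i] <= θ[i])
@objective(primal, Max, sum(x))
optimize!(primal)

# Reverse mode: seed with ∂f/∂x* (here simply 1 for each coordinate) and pull
# the sensitivity back onto the constraint data.
for xi in x
    MOI.set(primal, DiffOpt.ReverseVariablePrimal(), xi, 1.0)
end
DiffOpt.reverse_differentiate!(primal)
dcon = [MOI.get(primal, DiffOpt.ReverseConstraintFunction(), con[i]) for i in 1:2]

# Dualize the same LP and repeat the pattern: the primal variables of
# `dual_model` are the dual multipliers of the original problem, so DiffOpt
# applied to `dual_model` differentiates the original duals.
dual_model = Dualization.dualize(primal)
set_optimizer(dual_model, () -> DiffOpt.diff_optimizer(HiGHS.Optimizer))
set_silent(dual_model)
optimize!(dual_model)
# ...then seed DiffOpt.ReverseVariablePrimal on all_variables(dual_model) and
# call DiffOpt.reverse_differentiate!(dual_model) as above.
```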

innuo (Author) commented Jan 9, 2025

> You can use DiffOpt to differentiate the primal, then compute the dual with Dualization and use DiffOpt on that dual model.

Thanks. That is what I was going to do, but it needlessly doubles the solve time. The duals are produced as a side effect of the first solve anyway, and it would be nice to compute the gradient from the information that is already available. (This is possible, as shown in https://arxiv.org/abs/1703.00443.)
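
For reference, the construction in that paper (stated there for a QP $\min_z \tfrac{1}{2} z^\top Q z + q^\top z$ s.t. $Az = b$, $Gz \le h$; an LP is roughly the $Q = 0$ case, though the system can be degenerate at vertices) differentiates the KKT conditions at the optimum $(z^*, \lambda^*, \nu^*)$:

$$
\begin{bmatrix}
Q & G^\top & A^\top \\
D(\lambda^*)\,G & D(Gz^* - h) & 0 \\
A & 0 & 0
\end{bmatrix}
\begin{bmatrix} \mathrm{d}z \\ \mathrm{d}\lambda \\ \mathrm{d}\nu \end{bmatrix}
=
-\begin{bmatrix}
\mathrm{d}Q\,z^* + \mathrm{d}q + \mathrm{d}G^\top \lambda^* + \mathrm{d}A^\top \nu^* \\
D(\lambda^*)\left(\mathrm{d}G\,z^* - \mathrm{d}h\right) \\
\mathrm{d}A\,z^* - \mathrm{d}b
\end{bmatrix}
$$

where $D(\cdot)$ is a diagonal matrix. Both $\mathrm{d}z$ and $(\mathrm{d}\lambda, \mathrm{d}\nu)$ come out of one solve with the KKT matrix the solver has already assembled, which is why a separate dualize-and-differentiate pass should not be necessary.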

joaquimg (Member) commented Jan 9, 2025

That is not currently available, but it is possible to implement it in DiffOpt. PRs are welcome.

joaquimg (Member) commented
This is also a duplicate of #94.
