
Differentiating the dual variable of a linear program #261

Closed
@innuo

Description

Say I have a function that solves a linear program whose parameters are based on the arguments to the function. The function returns the sum of the norm of the optimal primal solution and the norm of the optimal dual vector of the constraints. Can I use DiffOpt to find the gradient of the return value with respect to the arguments?

I looked through the docs and the examples but it wasn't clear to me how to do this or if it is possible. Any pointers would be greatly appreciated.
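For concreteness, here is a hypothetical sketch of the kind of function I mean (the LP data, the parameter name `θ`, and the HiGHS solver choice are all placeholders for illustration, not my actual problem):

```julia
using JuMP, DiffOpt, HiGHS, LinearAlgebra

# Hypothetical function: solves an LP whose right-hand side depends on θ,
# then combines the primal and dual solutions into a scalar.
function f(θ)
    # diff_optimizer wraps the solver so DiffOpt can differentiate through it
    model = Model(() -> DiffOpt.diff_optimizer(HiGHS.Optimizer))
    set_silent(model)
    @variable(model, x[1:2] >= 0)
    @constraint(model, con, x[1] + x[2] <= θ)  # constraint data depends on the argument θ
    @objective(model, Min, -x[1] - 2x[2])
    optimize!(model)
    # return ‖x*‖ + ‖λ*‖; I would like df/dθ
    return norm(value.(x)) + norm(dual(con))
end
```

Differentiating the `value.(x)` part through DiffOpt seems to be covered by the examples, but it is the `dual(con)` term whose sensitivity I cannot figure out how to obtain.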
