
Add IPOPT Optimizer #111

Draft · wants to merge 20 commits into base: dev
Conversation

schmoelder (Contributor) commented Mar 14, 2024

This PR adds support for the interior point optimizer IPOPT via cyipopt.
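For reference, a minimal sketch of cyipopt's SciPy-style interface; the Rosenbrock problem here is illustrative and not part of this PR:

import numpy as np
from cyipopt import minimize_ipopt
from scipy.optimize import rosen, rosen_der

# Minimize the Rosenbrock function; the analytic gradient is passed via `jac`.
x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize_ipopt(rosen, x0, jac=rosen_der)
print(res.x)  # should converge to the minimum at [1, 1, 1, 1, 1]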

@flo-schu I guess this still needs some work, but it has been sitting in a local branch for a long time, and this might be a good opportunity to include it in the test runners, and maybe even in the optimization study.

Could you please jog my memory and briefly tell me what is required to run the `test_optimizer_behavior` tests using pytest?

@schmoelder schmoelder requested a review from flo-schu March 14, 2024 10:47
@schmoelder schmoelder changed the title Add IOPOT Optimizer Add IPOPT Optimizer Mar 14, 2024
@schmoelder schmoelder force-pushed the dev branch 4 times, most recently from f778466 to 0534130 on March 16, 2024 12:45
flo-schu (Collaborator) commented:

The new optimizer needs to be inserted here:

import pytest
# Optimizer classes as used in tests/test_optimizer_behavior.py; the exact
# import path is assumed to be CADETProcess.optimization.
from CADETProcess.optimization import SLSQP, TrustConstr, U_NSGA3, GPEI, NEHVI

@pytest.fixture(params=[
    SLSQP,
    TrustConstr,
    U_NSGA3,
    GPEI,
    NEHVI,
    # IPOPT,  # <- insert the new optimizer here once it is importable
])
def optimizer(request):
    return request.param()

That's it. Then run the test with `pytest tests/test_optimizer_behavior.py::test_from_initial_values`.

By the way, dev currently uses both `test_from_initial_values` and `test_convergence`. The two tests are almost identical, and I don't think `test_convergence` is necessary.

ronald-jaepel and others added 7 commits on June 18, 2024 21:13
Previously, the `OptimizationProblem` had two interfaces for calling evaluation functions (e.g. objectives): one for evaluating individuals, and one for populations.
To simplify the code base, these two methods have now been unified.
To ensure backward compatibility, a 1D array is returned if a single individual is passed to the function.
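A sketch of the unified call pattern this describes; the function and argument names below are hypothetical, not the actual `OptimizationProblem` API:

import numpy as np

def evaluate(eval_fun, x):
    """Evaluate `eval_fun` for a single individual (1D input) or a population (2D input)."""
    x = np.asarray(x)
    is_single = (x.ndim == 1)
    population = np.atleast_2d(x)
    results = np.array([eval_fun(individual) for individual in population])
    # Backward compatibility: a single individual yields a 1D array.
    return results[0] if is_single else results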
@ronald-jaepel ronald-jaepel marked this pull request as draft June 19, 2024 08:44
schmoelder (Contributor, Author) commented:

@hannahlanzrath, this is another optimizer that uses gradient information. It is still WIP, but maybe you'd be up for testing it a bit, especially with regard to how Jacobian (and Hessian) matrices are computed?
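For reference, a hedged sketch of how gradient information can be supplied through cyipopt's SciPy-style interface; the objective is illustrative, and whether the wrapper in this PR exposes it this way is exactly the open question:

import numpy as np
from cyipopt import minimize_ipopt

def objective(x):
    return x[0] ** 2 + 2 * x[1] ** 2

def gradient(x):
    # Analytic Jacobian of the objective.
    return np.array([2 * x[0], 4 * x[1]])

# If no Hessian is provided, IPOPT can fall back to a quasi-Newton
# approximation via the 'hessian_approximation' option.
res = minimize_ipopt(
    objective,
    x0=np.array([1.0, 1.0]),
    jac=gradient,
    options={'hessian_approximation': 'limited-memory'},
)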
