
Consider updating SLSQP to a modern and maintained implementation #303

Open
ewu63 opened this issue Jun 4, 2022 · 4 comments
Labels
maintenance This is for maintaining the repo

Comments

@ewu63
Collaborator

ewu63 commented Jun 4, 2022

Description of feature

Currently, the version of SLSQP provided is quite old, and suffers from several bugs that have been fixed elsewhere. See #301 for some discussion. Since SLSQP remains a rather popular optimizer, to maintain long term viability, I think it would be best to switch to using a version that is better maintained. This would also avoid any duplication in maintenance efforts.

Potential solution

As far as I'm aware, there are three versions out there:

  • SciPy: well maintained and widely available, and we already depend on SciPy, so there would be no additional dependencies. However, it seems to lack features such as retrieving the optimal Lagrange multipliers, which pyOptSparse exposes (though what we have might be broken; I don't really remember).
  • slsqp: a much more modern rewrite of the old F77 code that appears to be very well maintained, but it lacks a Python interface.
  • NLopt: since it is built into an entire optimization framework, we will not consider this option further.

This thread will serve as a place to discuss future plans regarding SLSQP.
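For reference, switching to SciPy's implementation would amount to calling `scipy.optimize.minimize` with `method="SLSQP"`. A minimal sketch on a made-up illustrative problem (not one from this thread):

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative problem: minimize x^2 + y^2 subject to x + y >= 1.
def obj(x):
    return x[0] ** 2 + x[1] ** 2

# SciPy expects inequality constraints in the form g(x) >= 0.
cons = {"type": "ineq", "fun": lambda x: x[0] + x[1] - 1.0}

res = minimize(obj, x0=[2.0, 2.0], method="SLSQP", constraints=[cons])
print(res.x)  # expect roughly [0.5, 0.5]
```

Note that the returned `OptimizeResult` for SLSQP does not expose the Lagrange multipliers, which is the gap mentioned above.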

@ewu63 added the maintenance label on Jun 4, 2022
@andrewellis55
Contributor

@ewu63 Question for you on this: does pyOptSparse add any measurable advantage over just using SciPy's minimize for SLSQP, given that the end result uses a dense Jacobian anyway? I understand the usefulness of the common interface between the various optimizers, but as an OpenMDAO user, where the ScipyOptimizeDriver is available, is there in your opinion any advantage to using pyOptSparse for SLSQP?

@ewu63
Collaborator Author

ewu63 commented Sep 24, 2024

> @ewu63 Question for you on this: does pyOptSparse add any measurable advantage over just using SciPy's minimize for SLSQP, given that the end result uses a dense Jacobian anyway? I understand the usefulness of the common interface between the various optimizers, but as an OpenMDAO user, where the ScipyOptimizeDriver is available, is there in your opinion any advantage to using pyOptSparse for SLSQP?

It's not clear to us: we did some benchmarking years ago and could not conclusively determine whether our version or SciPy's is superior. @marcomangano can probably add more on this. As you said, key features are generally not supported, since SLSQP is a very old and lightly maintained optimizer that has been fairly static, so I don't expect performance to differ much between versions; the bounds issue is the only notable divergence I am aware of.
If you are interested in advanced features such as sparsity or failure handling, IPOPT or SNOPT is probably your best bet.
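On the bounds issue: one hedged way to probe a given SLSQP build is to record whether the objective is ever evaluated outside the declared bounds. A minimal sketch on an illustrative problem (not a definitive test of any particular version):

```python
import numpy as np
from scipy.optimize import minimize

violations = []

def obj(x):
    # Record any evaluation outside the declared box [0, 1]^2.
    if np.any(x < 0.0) or np.any(x > 1.0):
        violations.append(x.copy())
    return (x[0] - 2.0) ** 2 + (x[1] - 2.0) ** 2

# Unconstrained optimum is (2, 2), so the solution must land on the
# upper bounds, roughly (1, 1).
res = minimize(obj, x0=[0.5, 0.5], method="SLSQP",
               bounds=[(0.0, 1.0)] * 2)
print(res.x)
print(len(violations))  # how often SLSQP stepped outside the box, if at all
```

The final iterate should respect the bounds; whether intermediate evaluations do depends on the SLSQP build in use.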

@marcomangano
Contributor

I don't have much to add to what @ewu63 said. I did some benchmarking at the beginning of my PhD using the Rosenbrock and Sellar problems and the scalable function in Sec. 5.4 of this paper by Tedford and Martins, also testing different MDO architectures. Maybe there is value in resurrecting these scripts?
Anyway, I recall negligible discrepancies except for some corner cases where SciPy was more robust, so I would suggest using that version if you want to use SLSQP. It looks like even the NLopt implementation is largely based on SciPy's, which at this point I believe is the reference one.
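The original benchmark scripts are not in this thread, but a minimal sketch of what one such run might look like with SciPy's SLSQP on the Rosenbrock function (using SciPy's built-in `rosen` helpers; the problem size and options here are illustrative):

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

# Hypothetical re-creation of a simple benchmark run: SLSQP on the
# n-dimensional Rosenbrock function with analytic gradients.
x0 = np.zeros(4)
res = minimize(rosen, x0, jac=rosen_der, method="SLSQP",
               options={"maxiter": 500, "ftol": 1e-9})
print(res.success, res.nit)
print(res.x)  # the optimum is the all-ones vector
```

Comparing wall time, iteration counts, and final objective values across SLSQP builds on a suite of such problems is essentially what the benchmarking described above amounted to.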

I also agree with @ewu63's last point. I think the value of SLSQP is its simplicity and ease of use, but I would not rely on it for complex optimization problems. It has very simple termination criteria and failure handling, among other things. There are better SQP-based optimizers out there. SNOPT is our go-to, but it is a commercial code, so you need to obtain a license. IPOPT has shown great performance but may have a steeper learning curve, and tweaking its options to work for your case might take some more trial and error.

@ewu63
Collaborator Author

ewu63 commented Oct 3, 2024

FWIW, there is yet another new SLSQP package from Prof. Hwang's group: repo and paper. If they are able to commit to long-term maintenance of this package, it may be the best option going forward.
