Merge branch 'main' into paropt-wrapper
gjkennedy authored Oct 2, 2024
2 parents 8c5a27f + bc021e4 commit d82a685
Showing 38 changed files with 832 additions and 254 deletions.
11 changes: 6 additions & 5 deletions .github/environment.yml
@@ -1,18 +1,19 @@
 dependencies:
   # build
-  - python >=3.8
-  - numpy >=1.16
+  - python >=3.9
+  - numpy >=1.21,<2
   - ipopt
   - swig
-  - meson =0.61
+  - meson >=1.3.2
   - compilers
   - pkg-config
   - pip
   - setuptools
+  - build
+  - packaging
   # testing
   - parameterized
   - testflo
-  - scipy >1.2
+  - scipy >=1.7
   - mdolab-baseclasses >=1.3.1
-  - sqlitedict >=1.6
\ No newline at end of file
+  - sqlitedict >=1.6
2 changes: 1 addition & 1 deletion .github/test_real.sh
@@ -11,4 +11,4 @@ cd tests
 # we have to copy over the coveragerc file to make sure it's in the
 # same directory where codecov is run
 cp ../.coveragerc .
-testflo --pre_announce -v --coverage --coverpkg pyoptsparse $EXTRA_FLAGS
+testflo --pre_announce --disallow_deprecations -v --coverage --coverpkg pyoptsparse $EXTRA_FLAGS
2 changes: 1 addition & 1 deletion .github/workflows/windows-build.yml
@@ -16,7 +16,7 @@ jobs:
       - uses: actions/checkout@v2
       - uses: conda-incubator/setup-miniconda@v2
         with:
-          python-version: 3.8
+          python-version: 3.9
          miniforge-variant: Mambaforge
          channels: conda-forge,defaults
          channel-priority: strict
2 changes: 1 addition & 1 deletion .zenodo.json
@@ -1,7 +1,7 @@
 {
     "creators": [
         {
-            "name": "Neil Wu"
+            "name": "Ella Wu"
         },
         {
             "name": "Gaetan Kenway"
4 changes: 2 additions & 2 deletions doc/citation.rst
@@ -4,7 +4,7 @@ Citation
 ========
 If you use pyOptSparse, please cite the following paper:

-N. Wu, G. Kenway, C. A. Mader, J. Jasa, and J. R. R. A. Martins. pyOptSparse: A Python framework for large-scale constrained nonlinear optimization of sparse systems. Journal of Open Source Software, 5(54), 2564, October 2020. https://doi.org/10.21105/joss.02564
+E. Wu, G. Kenway, C. A. Mader, J. Jasa, and J. R. R. A. Martins. pyOptSparse: A Python framework for large-scale constrained nonlinear optimization of sparse systems. Journal of Open Source Software, 5(54), 2564, October 2020. https://doi.org/10.21105/joss.02564

 The paper is available online from the Journal of Open Source Software `here <https://joss.theoj.org/papers/10.21105/joss.02564>`__.
 To cite this paper, you can use the following BibTeX entry:
@@ -18,7 +18,7 @@ To cite this paper, you can use the following BibTeX entry:
         volume = {5},
         number = {54},
         pages = {2564},
-        author = {Neil Wu and Gaetan Kenway and Charles A. Mader and John Jasa and Joaquim R. R. A. Martins},
+        author = {Ella Wu and Gaetan Kenway and Charles A. Mader and John Jasa and Joaquim R. R. A. Martins},
         title = {pyOptSparse: A Python framework for large-scale constrained nonlinear optimization of sparse systems},
         journal = {Journal of Open Source Software}
     }
31 changes: 22 additions & 9 deletions doc/guide.rst
@@ -22,13 +22,13 @@ The optimization class is created using the following call:

 .. code-block:: python

-    optProb = Optimization("name", objFun)
+    optProb = Optimization("name", objconFun)

-The general template of the objective function is as follows:
+The general template of the objective and constraint function is as follows:

 .. code-block:: python

-    def obj_fun(xdict):
+    def objconFun(xdict):
         funcs = {}
         funcs["obj_name"] = function(xdict)
         funcs["con_name"] = function(xdict)
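As a concrete, runnable sketch of the template in the diff above (the group name ``xvars`` and the quadratic test functions are invented for illustration; only the ``(funcs, fail)`` calling convention comes from the docs):

```python
import numpy as np

# Sketch of a pyOptSparse objective/constraint callable: it receives a dict
# of design-variable groups and returns (funcs, fail).
def objconFun(xdict):
    x = xdict["xvars"]  # design-variable group -> numpy array
    funcs = {}
    funcs["obj"] = float(x[0] ** 2 + x[1] ** 2)  # objective entry
    funcs["con"] = float(x[0] + x[1])            # constraint entry
    fail = False  # True would signal an analysis failure to the optimizer
    return funcs, fail

# With pyOptSparse installed, this callable would be registered roughly as:
#   optProb = Optimization("quadratic", objconFun)
#   optProb.addVarGroup("xvars", 2, lower=-10, upper=10, value=1.0)
#   optProb.addConGroup("con", 1, lower=1.0)
#   optProb.addObj("obj")
funcs, fail = objconFun({"xvars": np.array([1.0, 2.0])})
print(funcs["obj"])  # 5.0
```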
@@ -196,17 +196,30 @@ This argument is a dictionary, and the keys must match the design variable sets
 Essentially what we have done is specified the which blocks of the constraint rows are non-zero,
 and provided the sparsity structure of ones that are sparse.

-For linear constraints the values in ``jac`` are meaningful:
-they must be the actual linear constraint Jacobian values (which do not change).
-For non-linear constraints, only the sparsity structure (i.e. which entries are nonzero) is significant.
-The values themselves will be determined by a call to the ``sens()`` function.
-
-Also note, that the ``wrt`` and ``jac`` keyword arguments are only supported when user-supplied sensitivity is used.
+Note that the ``wrt`` and ``jac`` keyword arguments are only supported when user-supplied sensitivity is used.
 If automatic gradients from pyOptSparse are used, the constraint Jacobian will necessarily be dense.

 .. note::
     Currently, only the optimizers SNOPT and IPOPT support sparse Jacobians.

+Linear Constraints
+~~~~~~~~~~~~~~~~~~
+Linear constraints in pyOptSparse are defined exclusively by ``jac``, ``lower``, and ``upper`` entries of the ``addConGroup`` method.
+For linear constraint :math:`g_L \leq Ax + b \leq g_U`, the constraint definition would look like:
+
+.. code-block:: python
+
+    optProb.addConGroup("con", num_cons, linear=True, wrt=["xvars"], jac={"xvars": A}, lower=gL - b, upper=gU - b)
+
+Users should not provide the linear constraint values (i.e., :math:`g = Ax + b`) in a user-defined objective/constraint function.
+pyOptSparse will raise an error if you do so.
+
+For linear constraints, the values in ``jac`` are meaningful:
+they must be the actual linear constraint Jacobian values (which do not change).
+For non-linear constraints, only the sparsity structure (i.e. which entries are nonzero) is significant.
+The values themselves will be determined by a call to the ``sens()`` function.
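A small numeric sketch of the bound shift described in the new linear-constraint docs (the 2×2 matrix, offsets, and bounds are made up; only the ``lower = gL - b`` / ``upper = gU - b`` rewrite comes from the diff):

```python
import numpy as np

# Linear constraint gL <= A @ x + b <= gU, rewritten as
# (gL - b) <= A @ x <= (gU - b), the form addConGroup expects.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([1.0, -1.0])
gL = np.array([0.0, 0.0])
gU = np.array([10.0, 10.0])

lower = gL - b
upper = gU - b
print(lower, upper)  # [-1.  1.] [ 9. 11.]

# With an Optimization instance this would be registered roughly as:
#   optProb.addConGroup("lincon", 2, linear=True, wrt=["xvars"],
#                       jac={"xvars": A}, lower=lower, upper=upper)
```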


Objectives
++++++++++

4 changes: 2 additions & 2 deletions doc/optimizers/NLPQLP.rst
@@ -12,8 +12,8 @@ solved. The line search can be performed with respect to two
 alternative merit functions, and the Hessian approximation is updated
 by a modified BFGS formula.

-NLPQLP is a proprietary software, which can be obtained `here <http://www.ai7.uni-bayreuth.de/nlpqlp.htm>`_.
-The latest version supported is v4.2.2.
+NLPQLP is a proprietary software, which can be obtained `here <https://www.schittkowski.de/numericalsoftware.php>`_.
+The supported versions are v4.2.2 and v5.0.3, but other versions may work.

 Options
 -------
30 changes: 30 additions & 0 deletions doc/optimizers/SNOPT_options.yaml
@@ -100,7 +100,37 @@ Return work arrays:
     These arrays can be used to hot start a subsequent optimization.
     The SNOPT option 'Sticky parameters' will also be automatically set to 'Yes' to facilitate the hot start.
+Work arrays save file:
+  desc: >
+    This option is unique to the Python wrapper.
+    The SNOPT work arrays will be pickled and saved to this file after each major iteration.
+    This file is useful if you want to restart an optimization that did not exit cleanly.
+    If None, the work arrays are not saved.
+snSTOP function handle:
+  desc: >
+    This option is unique to the Python wrapper.
+    A function handle can be supplied which is called at the end of each major iteration.
+    The following is an example of a callback function that saves the restart dictionary
+    to a different file after each major iteration.
+
+    .. code-block:: python
+
+        def snstopCallback(iterDict, restartDict):
+            # Get the major iteration number
+            nMajor = iterDict["nMajor"]
+
+            # Save the restart dictionary
+            writePickle(f"restart_{nMajor}.pickle", restartDict)
+
+            return 0
+snSTOP arguments:
+  desc: |
+    This option is unique to the Python wrapper.
+    It specifies a list of arguments that will be passed to the snSTOP function handle.
+    ``iterDict`` is always passed as an argument.
+    Additional arguments are passed in the same order as this list.
+    The possible values are
+
+    - ``restartDict``
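A runnable sketch of such a callback using only the standard library (``writePickle`` in the docs above is a pyOptSparse utility; plain ``pickle`` is substituted here, and the dictionary contents are dummy values):

```python
import os
import pickle

def snstopCallback(iterDict, restartDict):
    # Save the restart dictionary under the current major iteration number.
    nMajor = iterDict["nMajor"]
    with open(f"restart_{nMajor}.pickle", "wb") as f:
        pickle.dump(restartDict, f)
    return 0  # a nonzero return would ask SNOPT to stop

# Exercise the callback with dummy data.
ret = snstopCallback({"nMajor": 3}, {"hs": [0, 1]})
print(ret, os.path.exists("restart_3.pickle"))  # 0 True
```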
2 changes: 1 addition & 1 deletion paper/paper.md
@@ -4,7 +4,7 @@ tags:
   - optimization
   - Python
 authors:
-  - name: Neil Wu
+  - name: Ella Wu
     orcid: 0000-0001-8856-9661
     affiliation: 1
   - name: Gaetan Kenway
24 changes: 23 additions & 1 deletion pyoptsparse/__init__.py
@@ -1,4 +1,4 @@
-__version__ = "2.10.1"
+__version__ = "2.12.0"

 from .pyOpt_history import History
 from .pyOpt_variable import Variable
@@ -19,3 +19,25 @@
 from .pyNSGA2.pyNSGA2 import NSGA2
 from .pyALPSO.pyALPSO import ALPSO
 from .pyParOpt.ParOpt import ParOpt
+
+__all__ = [
+    "History",
+    "Variable",
+    "Gradient",
+    "Constraint",
+    "Objective",
+    "Optimization",
+    "Optimizer",
+    "OPT",
+    "Optimizers",
+    "Solution",
+    "SNOPT",
+    "IPOPT",
+    "SLSQP",
+    "CONMIN",
+    "PSQP",
+    "NLPQLP",
+    "NSGA2",
+    "ALPSO",
+    "ParOpt",
+]
10 changes: 4 additions & 6 deletions pyoptsparse/pyALPSO/pyALPSO.py
@@ -2,6 +2,7 @@
 pyALPSO - A pyOptSparse interface to ALPSO
 work with sparse optimization problems.
 """
+
 # Standard Python modules
 import datetime
 import time
@@ -10,6 +11,7 @@
 import numpy as np

 # Local modules
+from . import alpso
 from ..pyOpt_error import Error
 from ..pyOpt_optimizer import Optimizer
@@ -25,9 +27,7 @@ class ALPSO(Optimizer):
     - pll_type -> STR: ALPSO Parallel Implementation (None, SPM- Static, DPM- Dynamic, POA-Parallel Analysis), *Default* = None
     """

-    def __init__(self, raiseError=True, options={}):
-        from . import alpso
-
+    def __init__(self, options={}):
         self.alpso = alpso

         category = "Global Optimizer"
@@ -192,9 +192,7 @@ def objconfunc(x):
             self.optProb.comm.bcast(-1, root=0)

         # Store Results
-        sol_inform = {}
-        # sol_inform['value'] = inform
-        # sol_inform['text'] = self.informs[inform[0]]
+        sol_inform = {"value": "", "text": ""}

         # Create the optimization solution
         sol = self._createSolution(optTime, sol_inform, opt_f, opt_x)
20 changes: 9 additions & 11 deletions pyoptsparse/pyCONMIN/pyCONMIN.py
@@ -2,11 +2,7 @@
 pyCONMIN - A variation of the pyCONMIN wrapper specificially designed to
 work with sparse optimization problems.
 """
-# Compiled module
-try:
-    from . import conmin  # isort: skip
-except ImportError:
-    conmin = None

 # Standard Python modules
 import datetime
 import os
@@ -18,6 +14,11 @@
 # Local modules
 from ..pyOpt_error import Error
 from ..pyOpt_optimizer import Optimizer
+from ..pyOpt_utils import try_import_compiled_module_from_path
+
+# import the compiled module
+THIS_DIR = os.path.dirname(os.path.abspath(__file__))
+conmin = try_import_compiled_module_from_path("conmin", THIS_DIR, raise_warning=True)


class CONMIN(Optimizer):
@@ -30,9 +31,8 @@ def __init__(self, raiseError=True, options={}):
         category = "Local Optimizer"
         defOpts = self._getDefaultOptions()
         informs = self._getInforms()
-        if conmin is None:
-            if raiseError:
-                raise Error("There was an error importing the compiled conmin module")
+        if isinstance(conmin, str) and raiseError:
+            raise ImportError(conmin)

         self.set_options = []
         super().__init__(name, category, defaultOptions=defOpts, informs=informs, options=options)
@@ -241,9 +241,7 @@ def cnmngrad(n1, n2, x, f, g, ct, df, a, ic, nac):
             self.optProb.comm.bcast(-1, root=0)

         # Store Results
-        sol_inform = {}
-        # sol_inform['value'] = inform
-        # sol_inform['text'] = self.informs[inform[0]]
+        sol_inform = {"value": "", "text": ""}

         # Create the optimization solution
         sol = self._createSolution(optTime, sol_inform, ff, xs)
44 changes: 29 additions & 15 deletions pyoptsparse/pyIPOPT/pyIPOPT.py
@@ -1,24 +1,31 @@
 """
 pyIPOPT - A python wrapper to the core IPOPT compiled module.
 """
-# Compiled module
-try:
-    from . import pyipoptcore  # isort: skip
-except ImportError:
-    pyipoptcore = None

 # Standard Python modules
 import copy
 import datetime
 import os
 import time

 # External modules
 import numpy as np

 # Local modules
 from ..pyOpt_error import Error
 from ..pyOpt_optimizer import Optimizer
-from ..pyOpt_utils import ICOL, INFINITY, IROW, convertToCOO, extractRows, scaleRows
+from ..pyOpt_utils import (
+    ICOL,
+    INFINITY,
+    IROW,
+    convertToCOO,
+    extractRows,
+    scaleRows,
+    try_import_compiled_module_from_path,
+)
+
+# import the compiled module
+THIS_DIR = os.path.dirname(os.path.abspath(__file__))
+pyipoptcore = try_import_compiled_module_from_path("pyipoptcore", THIS_DIR)


class IPOPT(Optimizer):
@@ -36,9 +43,8 @@ def __init__(self, raiseError=True, options={}):
         defOpts = self._getDefaultOptions()
         informs = self._getInforms()

-        if pyipoptcore is None:
-            if raiseError:
-                raise Error("There was an error importing the compiled IPOPT module")
+        if isinstance(pyipoptcore, str) and raiseError:
+            raise ImportError(pyipoptcore)

         super().__init__(
             name,
@@ -155,7 +161,7 @@ def __call__(

         if len(optProb.constraints) == 0:
             # If the user *actually* has an unconstrained problem,
-            # snopt sort of chokes with that....it has to have at
+            # IPOPT sort of chokes with that....it has to have at
             # least one constraint. So we will add one
             # automatically here:
             self.unconstrained = True
@@ -211,19 +217,25 @@
         # Define the 4 call back functions that ipopt needs:
         def eval_f(x, user_data=None):
             fobj, fail = self._masterFunc(x, ["fobj"])
-            if fail == 2:
+            if fail == 1:
+                fobj = np.array(np.NaN)
+            elif fail == 2:
                 self.userRequestedTermination = True
             return fobj

         def eval_g(x, user_data=None):
             fcon, fail = self._masterFunc(x, ["fcon"])
-            if fail == 2:
+            if fail == 1:
+                fcon = np.array(np.NaN)
+            elif fail == 2:
                 self.userRequestedTermination = True
             return fcon.copy()

         def eval_grad_f(x, user_data=None):
             gobj, fail = self._masterFunc(x, ["gobj"])
-            if fail == 2:
+            if fail == 1:
+                gobj = np.array(np.NaN)
+            elif fail == 2:
                 self.userRequestedTermination = True
             return gobj.copy()

@@ -232,7 +244,9 @@ def eval_jac_g(x, flag, user_data=None):
                 return copy.deepcopy(matStruct)
             else:
                 gcon, fail = self._masterFunc(x, ["gcon"])
-                if fail == 2:
+                if fail == 1:
+                    gcon = np.array(np.NaN)
+                elif fail == 2:
                     self.userRequestedTermination = True
                 return gcon.copy()
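The fail-code convention these callbacks adopt can be sketched in isolation (the semantics are read off the diff: 0 means success, 1 maps a failed point to NaN so IPOPT backtracks, 2 records a user-requested termination; the helper name is invented):

```python
import numpy as np

def handle_fail(value, fail):
    # Mirror of the callback pattern: fail == 1 replaces the value with NaN;
    # fail == 2 flags a user-requested termination.
    terminate = False
    if fail == 1:
        value = np.array(np.nan)
    elif fail == 2:
        terminate = True
    return value, terminate

v, stop = handle_fail(np.array(1.5), 1)
print(bool(np.isnan(v)), stop)  # True False
```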

Expand Down
Loading

0 comments on commit d82a685

Please sign in to comment.