
Commit

Merge branch 'main' into parOptInterface
A-CGray authored Oct 2, 2024
2 parents 236b957 + bc021e4 commit 9011d96
Showing 11 changed files with 244 additions and 46 deletions.
1 change: 1 addition & 0 deletions .github/environment.yml
@@ -10,6 +10,7 @@ dependencies:
- pip
- setuptools
- build
- packaging
# testing
- parameterized
- testflo
31 changes: 22 additions & 9 deletions doc/guide.rst
@@ -22,13 +22,13 @@ The optimization class is created using the following call:

.. code-block:: python
optProb = Optimization("name", objFun)
optProb = Optimization("name", objconFun)
The general template of the objective function is as follows:
The general template of the objective and constraint function is as follows:

.. code-block:: python
- def obj_fun(xdict):
+ def objconFun(xdict):
funcs = {}
funcs["obj_name"] = function(xdict)
funcs["con_name"] = function(xdict)
@@ -196,17 +196,30 @@ This argument is a dictionary, and the keys must match the design variable sets
Essentially, we have specified which blocks of the constraint rows are non-zero,
and provided the sparsity structure of the ones that are sparse.

- For linear constraints the values in ``jac`` are meaningful:
- they must be the actual linear constraint Jacobian values (which do not change).
- For non-linear constraints, only the sparsity structure (i.e. which entries are nonzero) is significant.
- The values themselves will be determined by a call to the ``sens()`` function.

- Also note, that the ``wrt`` and ``jac`` keyword arguments are only supported when user-supplied sensitivity is used.
+ Note that the ``wrt`` and ``jac`` keyword arguments are only supported when user-supplied sensitivity is used.
If automatic gradients from pyOptSparse are used, the constraint Jacobian will necessarily be dense.

.. note::
Currently, only the optimizers SNOPT and IPOPT support sparse Jacobians.
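
For illustration, here is a minimal sketch of declaring such a sparse Jacobian structure.
The group names, sizes, and bounds are hypothetical, ``optProb`` is assumed to be an existing ``Optimization`` instance with design variable groups ``xvars`` and ``yvars``, and for non-linear constraints only the non-zero pattern of the supplied blocks matters (the values are filled in later by ``sens()``):

.. code-block:: python

import numpy as np

nCon = 4  # number of constraints in this group (hypothetical)

# "con" depends only on the "xvars" and "yvars" groups: a dense 4x3 block
# with respect to "xvars" and a diagonal 4x4 block with respect to "yvars".
# Groups omitted from wrt are treated as identically zero blocks.
jacStruct = {
    "xvars": np.ones((nCon, 3)),
    "yvars": np.eye(nCon),
}

optProb.addConGroup("con", nCon, lower=0.0, wrt=["xvars", "yvars"], jac=jacStruct)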

Linear Constraints
~~~~~~~~~~~~~~~~~~
Linear constraints in pyOptSparse are defined exclusively by the ``jac``, ``lower``, and ``upper`` arguments of the ``addConGroup`` method.
For a linear constraint :math:`g_L \leq Ax + b \leq g_U`, the constraint definition would look like:

.. code-block:: python
optProb.addConGroup("con", num_cons, linear=True, wrt=["xvars"], jac={"xvars": A}, lower=gL - b, upper=gU - b)
Users should not provide the linear constraint values (i.e., :math:`g = Ax + b`) in a user-defined objective/constraint function.
pyOptSparse will raise an error if you do so.

For linear constraints, the values in ``jac`` are meaningful:
they must be the actual linear constraint Jacobian values (which do not change).
For non-linear constraints, only the sparsity structure (i.e. which entries are nonzero) is significant.
The values themselves will be determined by a call to the ``sens()`` function.
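
As a concrete illustration of the call above, here is a short sketch with hypothetical values for :math:`A`, :math:`b`, and the bounds (``optProb`` is assumed to be an existing ``Optimization`` instance); note that the constant offset :math:`b` is folded into the bounds:

.. code-block:: python

import numpy as np

# Hypothetical linear constraint 0 <= A x + b <= 1 with two variables and two constraints
A = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([1.0, -1.0])
gL = np.zeros(2)
gU = np.ones(2)

optProb.addVarGroup("xvars", 2, value=0.0)
optProb.addConGroup("lin_con", 2, linear=True, wrt=["xvars"], jac={"xvars": A}, lower=gL - b, upper=gU - b)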


Objectives
++++++++++

30 changes: 30 additions & 0 deletions doc/optimizers/SNOPT_options.yaml
@@ -100,7 +100,37 @@ Return work arrays:
These arrays can be used to hot start a subsequent optimization.
The SNOPT option 'Sticky parameters' will also be automatically set to 'Yes' to facilitate the hot start.
Work arrays save file:
desc: >
This option is unique to the Python wrapper.
The SNOPT work arrays will be pickled and saved to this file after each major iteration.
This file is useful if you want to restart an optimization that did not exit cleanly.
If None, the work arrays are not saved.
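
For illustration, here is a short sketch of this workflow, mirroring the accompanying tests; ``optProb`` and ``sens`` are assumed to be defined elsewhere, and the pickle file name is arbitrary.

.. code-block:: python

from baseclasses.utils import readPickle
from pyoptsparse import OPT

# First run: the SNOPT work arrays are pickled to this file after each major iteration
opt = OPT("SNOPT", options={"Work arrays save file": "work_arrays.pickle"})
sol = opt(optProb, sens=sens)

# Later run: load the saved work arrays and hot start from them
restartDict = readPickle("work_arrays.pickle")
opt = OPT("SNOPT", options={"Start": "Hot", "Verify level": -1})
sol = opt(optProb, sens=sens, restartDict=restartDict)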
snSTOP function handle:
desc: >
This option is unique to the Python wrapper.
A function handle can be supplied which is called at the end of each major iteration.
The following is an example of a callback function that saves the restart dictionary
to a different file after each major iteration.
.. code-block:: python
def snstopCallback(iterDict, restartDict):
# Get the major iteration number
nMajor = iterDict["nMajor"]
# Save the restart dictionary
writePickle(f"restart_{nMajor}.pickle", restartDict)
return 0
snSTOP arguments:
desc: |
This option is unique to the Python wrapper.
It specifies a list of arguments that will be passed to the snSTOP function handle.
``iterDict`` is always passed as an argument.
Additional arguments are passed in the same order as this list.
The possible values are:
- ``restartDict``
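
For illustration, here is a minimal sketch that enables the callback shown above and requests that the restart dictionary be passed to it; ``optProb`` and ``sens`` are assumed to be defined elsewhere. As the wrapper code below shows, the snSTOP function handle requires a history file to be stored.

.. code-block:: python

from pyoptsparse import OPT

opt = OPT(
    "SNOPT",
    options={
        "snSTOP function handle": snstopCallback,
        "snSTOP arguments": ["restartDict"],
    },
)
sol = opt(optProb, sens=sens, storeHistory="opt.hst")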
2 changes: 1 addition & 1 deletion pyoptsparse/__init__.py
@@ -1,4 +1,4 @@
__version__ = "2.11.3"
__version__ = "2.12.0"

from .pyOpt_history import History
from .pyOpt_variable import Variable
20 changes: 20 additions & 0 deletions pyoptsparse/pyOpt_optimizer.py
@@ -367,6 +367,10 @@ def _masterFunc2(self, x, evaluate, writeHist=True):
self.userObjTime += time.time() - timeA
self.userObjCalls += 1

# Make sure the user-defined function does *not* return linear constraint values
if self.callCounter == 0:
self._checkLinearConstraints(funcs)

# Discard zero imaginary components in funcs
for key, val in funcs.items():
funcs[key] = np.real(val)
@@ -417,6 +421,10 @@ def _masterFunc2(self, x, evaluate, writeHist=True):
self.userObjTime += time.time() - timeA
self.userObjCalls += 1

# Make sure the user-defined function does *not* return linear constraint values
if self.callCounter == 0:
self._checkLinearConstraints(funcs)

# Discard zero imaginary components in funcs
for key, val in funcs.items():
funcs[key] = np.real(val)
@@ -867,6 +875,18 @@ def _on_setOption(self, name, value):
"""
pass

def _checkLinearConstraints(self, funcs):
"""
Makes sure that the user-defined obj/con function does not compute the linear constraint values
because the linear constraints are exclusively defined by jac and bounds in addConGroup.
"""
for conName in self.optProb.constraints:
if self.optProb.constraints[conName].linear and conName in funcs:
raise Error(
"Value for linear constraint returned from user obj function. Linear constraints "
+ "are evaluated internally and should not be returned from the user's function."
)

def setOption(self, name, value=None):
"""
Generic routine for all option setting. The routine does
37 changes: 34 additions & 3 deletions pyoptsparse/pySNOPT/pySNOPT.py
@@ -12,10 +12,10 @@
from typing import Any, Dict, Optional, Tuple

# External modules
-from baseclasses.utils import CaseInsensitiveSet
+from baseclasses.utils import CaseInsensitiveSet, writePickle
import numpy as np
from numpy import ndarray
-from pkg_resources import parse_version
+from packaging.version import parse as parse_version

# Local modules
from ..pyOpt_error import Error
@@ -60,7 +60,9 @@ def __init__(self, raiseError=True, options: Dict = {}):
{
"Save major iteration variables",
"Return work arrays",
"Work arrays save file",
"snSTOP function handle",
"snSTOP arguments",
}
)

@@ -118,7 +120,9 @@ def _getDefaultOptions() -> Dict[str, Any]:
"Total real workspace": [int, None],
"Save major iteration variables": [list, []],
"Return work arrays": [bool, False],
"Work arrays save file": [(type(None), str), None],
"snSTOP function handle": [(type(None), type(lambda: None)), None],
"snSTOP arguments": [list, []],
}
return defOpts

@@ -667,12 +671,39 @@ def _snstop(self, ktcond, mjrprtlvl, minimize, n, nncon, nnobj, ns, itn, nmajor,
if "funcs" in self.cache.keys():
iterDict["funcs"].update(self.cache["funcs"])

# Create the restart dictionary to be passed to snstop_handle
restartDict = {
"cw": cw,
"iw": iw,
"rw": rw,
"xs": x, # x is the same as xs; we call it x here to be consistent with the SNOPT subroutine snSTOP
"hs": hs,
"pi": pi,
}

workArraysSave = self.getOption("Work arrays save file")
if workArraysSave is not None:
# Save the restart dictionary
writePickle(workArraysSave, restartDict)

# perform callback if requested
snstop_handle = self.getOption("snSTOP function handle")
if snstop_handle is not None:

# Get the arguments to pass in to snstop_handle
# iterDict is always included
snstopArgs = [iterDict]
for snstopArg in self.getOption("snSTOP arguments"):
if snstopArg == "restartDict":
snstopArgs.append(restartDict)
else:
raise Error(f"Received unknown snSTOP argument {snstopArg}. "
+ "Please see 'snSTOP arguments' option in the pyOptSparse documentation "
+ "under 'SNOPT'.")

if not self.storeHistory:
raise Error("snSTOP function handle must be used with storeHistory=True")
- iabort = snstop_handle(iterDict)
+ iabort = snstop_handle(*snstopArgs)
# write iterDict again if anything was inserted
if self.storeHistory and callCounter is not None:
self.hist.write(callCounter, iterDict)
1 change: 1 addition & 0 deletions setup.py
@@ -102,6 +102,7 @@ def copy_shared_libraries():
platforms=["Linux"],
keywords="optimization",
install_requires=[
"packaging",
"sqlitedict>=1.6",
"numpy>=1.21,<2",
"scipy>=1.7",
86 changes: 86 additions & 0 deletions tests/test_hs015.py
@@ -1,9 +1,11 @@
"""Test solution of problem HS15 from the Hock & Schittkowski collection"""

# Standard Python modules
import os
import unittest

# External modules
from baseclasses.utils import readPickle, writePickle
import numpy as np
from parameterized import parameterized

@@ -193,6 +195,90 @@ def test_snopt_snstop(self):
# we should get 70/74
self.assert_inform_equal(sol, optInform=74)

def test_snopt_snstop_restart(self):
pickleFile = "restart.pickle"

def my_snstop_restart(iterDict, restartDict):
# Save the restart dictionary
writePickle(pickleFile, restartDict)

# Exit after 5 major iterations
if iterDict["nMajor"] == 5:
return 1

return 0

# Run the optimization for 5 major iterations
self.optName = "SNOPT"
self.setup_optProb()
optOptions = {
"snSTOP function handle": my_snstop_restart,
"snSTOP arguments": ["restartDict"],
}
sol = self.optimize(optOptions=optOptions, storeHistory=True)

# Check that the optimization exited with 74
self.assert_inform_equal(sol, optInform=74)

# Read the restart dictionary pickle file saved by snstop
restartDict = readPickle(pickleFile)

# Now optimize again but using the restart dictionary
self.setup_optProb()
opt = OPT(
self.optName,
options={
"Start": "Hot",
"Verify level": -1,
"snSTOP function handle": my_snstop_restart,
"snSTOP arguments": ["restartDict"],
},
)
histFile = "restart.hst"
sol = opt(self.optProb, sens=self.sens, storeHistory=histFile, restartDict=restartDict)

# Check that the optimization converged in fewer than 5 more major iterations
self.assert_solution_allclose(sol, 1e-12)
self.assert_inform_equal(sol, optInform=1)

# Delete the pickle and history files
os.remove(pickleFile)
os.remove(histFile)

def test_snopt_work_arrays_save(self):
# Run the optimization for 5 major iterations
self.optName = "SNOPT"
self.setup_optProb()
pickleFile = "work_arrays_save.pickle"
optOptions = {
"snSTOP function handle": self.my_snstop,
"Work arrays save file": pickleFile,
}
sol = self.optimize(optOptions=optOptions, storeHistory=True)

# Read the restart dictionary pickle file saved by snstop
restartDict = readPickle(pickleFile)

# Now optimize again but using the restart dictionary
self.setup_optProb()
opt = OPT(
self.optName,
options={
"Start": "Hot",
"Verify level": -1,
},
)
histFile = "work_arrays_save.hst"
sol = opt(self.optProb, sens=self.sens, storeHistory=histFile, restartDict=restartDict)

# Check that the optimization converged
self.assert_solution_allclose(sol, 1e-12)
self.assert_inform_equal(sol, optInform=1)

# Delete the pickle and history files
os.remove(pickleFile)
os.remove(histFile)

def test_snopt_failed_initial(self):
def failed_fun(x_dict):
funcs = {"obj": 0.0, "con": [np.nan, np.nan]}
47 changes: 47 additions & 0 deletions tests/test_lincon_error.py
@@ -0,0 +1,47 @@
"""
Tests that pyOptSparse raises an error when a user-defined obj/con function returns a linear constraint value
(which it should not, because linear constraints are defined exclusively by jac and bounds)
"""

# Standard Python modules
import unittest

# First party modules
from pyoptsparse import SLSQP, Optimization
from pyoptsparse.pyOpt_error import Error


def objfunc(xdict):
"""Evaluates the equation f(x,y) = (x-3)^2 + xy + (y+4)^2 - 3"""
x = xdict["x"]
funcs = {}

funcs["obj"] = x**2
funcs["con"] = x - 1 # falsely return a linear constraint value

fail = False
return funcs, fail


class TestLinearConstraintCheck(unittest.TestCase):
def test(self):
# define an optimization problem with a linear constraint
optProb = Optimization("test", objfunc)
optProb.addVarGroup("x", 1, value=1)
optProb.addObj("obj")
optProb.addConGroup("con", 1, lower=1.0, linear=True, wrt=["x"], jac={"x": [1.0]})

opt = SLSQP()
with self.assertRaises(Error) as context:
opt(optProb, sens="FD")

# check if we get the expected error message
err_msg = (
"Value for linear constraint returned from user obj function. Linear constraints "
+ "are evaluated internally and should not be returned from the user's function."
)
self.assertEqual(err_msg, str(context.exception))


if __name__ == "__main__":
unittest.main()
