Update ROLObjective to be dualspace-aware (#118)
* Move Riesz representation responsibility to ReducedFunctional

ROLObjective.gradient was calling ReducedFunctional.derivative and
then performing its own Riesz mapping according to the ROLVector's
inner product. With the recent dualspace update, this doubles up the
conversion and causes a type error. Instead, we defer the type
conversion responsibility to ReducedFunctional.derivative.

* Update computation of ROLObjective hessVec
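The double conversion described in the message can be sketched with toy classes. All names below (DualVector, PrimalVector, ToyReducedFunctional) are hypothetical illustrations of the failure mode, not pyadjoint's real API: once derivative() applies the Riesz map itself when asked via options, applying the vector's own map a second time hits an object of the wrong type.

```python
class DualVector:
    """Stand-in for a derivative living in the dual space."""
    def __init__(self, values):
        self.values = values

class PrimalVector:
    """Stand-in for the Riesz representative in the primal space."""
    def __init__(self, values):
        self.values = values

def riesz_map(vec):
    # With an identity inner product the Riesz map just re-wraps a
    # dual vector as a primal one. Mapping anything else is an error.
    if not isinstance(vec, DualVector):
        raise TypeError("riesz_map expects a DualVector")
    return PrimalVector(vec.values)

class ToyReducedFunctional:
    """After the dualspace update, derivative() performs the Riesz
    mapping itself when a representation is requested via options."""
    def derivative(self, options=None):
        deriv = DualVector([1.0, 2.0])
        if options and "riesz_representation" in options:
            return riesz_map(deriv)  # already primal
        return deriv

rf = ToyReducedFunctional()

# Old ROLObjective.gradient: map again on top of derivative's result.
try:
    riesz_map(rf.derivative(options={"riesz_representation": "l2"}))
except TypeError as exc:
    print("double conversion fails:", exc)

# Fixed behaviour: defer the conversion to derivative() and use the
# returned primal object as-is.
g = rf.derivative(options={"riesz_representation": "l2"})
print(type(g).__name__)  # PrimalVector
```

The same reasoning applies to hessVec: the Hessian action is requested with the desired representation, so no second mapping is needed.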
angus-g authored Sep 27, 2023
1 parent 78b3bd6 commit d532d41
Showing 1 changed file with 6 additions and 4 deletions.
10 changes: 6 additions & 4 deletions pyadjoint/optimization/rol_solver.py
@@ -20,12 +20,14 @@ def value(self, x, tol):
         return self._val
 
     def gradient(self, g, x, tol):
-        self.deriv = self.rf.derivative()
-        g.dat = g.riesz_map(self.deriv)
+        opts = {"riesz_representation": x.inner_product}
+        self.deriv = self.rf.derivative(options=opts)
+        g.dat = Enlist(self.deriv)
 
     def hessVec(self, hv, v, x, tol):
-        hessian_action = self.rf.hessian(v.dat)
-        hv.dat = hv.riesz_map(hessian_action)
+        opts = {"riesz_representation": x.inner_product}
+        hessian_action = self.rf.hessian(v.dat, options=opts)
+        hv.dat = Enlist(hessian_action)
 
     def update(self, x, flag, iteration):
         if hasattr(ROL, "UpdateType") and isinstance(flag, ROL.UpdateType):
