Problem description
With a left preconditioner Pl, the residual recorded by setting log=true is computed for the preconditioned equation, norm(Pl*b - Pl*A*x) = norm(Pl*r), not for the original equation, norm(b - A*x) = norm(r). The two norms can differ by orders of magnitude for certain Pl (a toy illustration is sketched just below, before the minimum example). There is no such issue for a right preconditioner Pr, because norm(b - A*Pr*z) = norm(b - A*x) = norm(r).
I understand that a left-preconditioned Krylov method uses the preconditioned residual for its termination criteria (abstol and reltol). However, a common need is to quantitatively compare left and right preconditioners, and the two norms are not directly comparable. There is also no way to reconstruct norm(r) from norm(Pl*r), since only a scalar is logged, not the residual vector.
In SciPy, users can define a custom callback to record different types of residuals. I can't find a similar feature in IterativeSolvers.jl, and wonder about the proper way to record unpreconditioned residual norms.
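To make "differ by orders of magnitude" concrete, here is a toy sketch, not tied to any solver: a diagonal Pl whose entries span several decades. Applying the preconditioner (Pl \ r, which plays the role of Pl*r in the notation above, matching the P \ usage in the minimum example) inflates the residual norm by roughly the largest inverse diagonal entry:
import LinearAlgebra: Diagonal, norm
r = ones(4)
Pl = Diagonal([1e-6, 1e-3, 1.0, 1e3])
norm(r)       # 2.0
norm(Pl \ r)  # ≈ 1e6, dominated by the 1e-6 diagonal entry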
Minimum example
using SparseArrays
using IterativeSolvers
import Preconditioners: DiagonalPreconditioner
import LinearAlgebra: norm
import Random: seed!
n = 24
seed!(0)
A = sprand(n, n, 0.1) * 0.2 + spdiagm(range(1, length=n))
b = rand(n);
P = DiagonalPreconditioner(A)
shared_kwargs = Dict(
    :maxiter => 3,
    :abstol => 1e-9,
    :reltol => 1e-9,
    :log => true,
)
# solver = bicgstabl # `bicgstabl` only allows Pl, not Pr
solver = gmres
x_raw, history_raw = solver(A, b; shared_kwargs...)
x_diag_r, history_diag_r = solver(A, b, Pr=P; shared_kwargs...)
x_diag_l, history_diag_l = solver(A, b, Pl=P; shared_kwargs...)
isapprox(norm(b - A * x_raw), history_raw[:resnorm][end])  # true
isapprox(norm(b - A * x_diag_r), history_diag_r[:resnorm][end])  # true
isapprox(norm(b - A * x_diag_l), history_diag_l[:resnorm][end])  # false, not recording un-preconditioned residual
isapprox(norm(P \ b - P \ (A * x_diag_l)), history_diag_l[:resnorm][end])  # true, recording preconditioned residual
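For completeness, the only workaround I can think of without a callback hook is a brute-force one: re-run the solver from scratch with maxiter = 1, 2, ..., k and evaluate norm(b - A*x) on the original equation after each run (GMRES is deterministic, and with n = 24 and maxiter = 3 the default restart length is not reached, so run k reproduces the first k iterations). A sketch reusing the variables from the minimum example above; it is wasteful, since each run redoes the earlier iterations:
true_resnorms = Float64[]
for k in 1:3
    # re-run the left-preconditioned solve, truncated at k iterations
    xk, _ = gmres(A, b; Pl=P, maxiter=k, abstol=1e-9, reltol=1e-9, log=true)
    push!(true_resnorms, norm(b - A * xk))  # residual of the original equation
end
true_resnorms  # un-preconditioned residual norms after 1, 2, 3 iterations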
SciPy reference
import numpy as np
import numpy.linalg as la
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 24
np.random.seed(0)
A = sp.random(n, n, density=0.1) * 0.2 + sp.diags(np.arange(1, n + 1))
b = np.random.rand(n)
P = sp.diags(1.0 / A.diagonal())
PA = spla.LinearOperator(A.shape, lambda x: P @ (A @ x))  # or P @ A for explicit matrix form
Pb = P @ b
history = []

def callback(xk):
    r_norm = la.norm(b - A.dot(xk))  # using unpreconditioned matrix
    history.append(r_norm)

x, _ = spla.gmres(PA, Pb, callback=callback, callback_type='x', maxiter=3)
# x, _ = spla.bicgstab(PA, Pb, callback=callback, maxiter=3)  # also works
la.norm(b - A @ x) == history[-1]  # true, residual is defined on original equation
la.norm(Pb - PA @ x) == history[-1]  # false, residual not defined on preconditioned equation