Problem with optimization with Optim.jl #25
-
Hi, I'm trying to use Optim.jl in "Optimization of a State-to-State Transfer in a Two-Level-System":

```julia
opt_result_OptimLBFGS = @optimize_or_load(
    datadir("TLS", "opt_result_OptimLBFGS.jld2"),
    problem,
    method = :grape,
    info_hook = chain_infohooks(
        GRAPELinesearchAnalysis.plot_linesearch(datadir("TLS", "Linesearch", "OptimLBFGS")),
        QuantumControl.GRAPE.print_table,
    ),
    optimizer = Optim.LBFGS(;
        alphaguess = LineSearches.InitialStatic(alpha=0.2),
        linesearch = LineSearches.HagerZhang(alphamax=2.0),
    ),
)
```

I receive this error:
Would you please help me in this regard? @goerz
Replies: 3 comments 4 replies
-
I'll have a look when I have a chance (which might take some time). In the meantime, I recommend sticking to the default L-BFGS-B optimizer.
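Sticking with the default L-BFGS-B would mean simply omitting the `optimizer` keyword from the call in the question. A sketch based on that call (the output filename here is hypothetical):

```julia
# Sketch: same optimization as above, but relying on GRAPE's default
# L-BFGS-B optimizer by omitting the `optimizer` keyword entirely.
opt_result_default = @optimize_or_load(
    datadir("TLS", "opt_result_LBFGSB.jld2"),
    problem,
    method = :grape,
    info_hook = QuantumControl.GRAPE.print_table,
)
```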
-
I just found out that GRAPELinesearchAnalysis.jl has a `_get_linesearch_data` function that needs `optimizer_state.g_previous`. But when I use Optim.jl for the optimization and select `Optim.GradientDescent`, the optimizer state doesn't have any `g_previous`; instead, it has `x_previous`.
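One way around this kind of mismatch would be to guard the field access so that optimizer states without a `g_previous` field are handled gracefully. This is a hypothetical sketch, not the actual GRAPELinesearchAnalysis.jl code; the mock state structs stand in for Optim.jl's internal state types:

```julia
# Mock stand-ins for Optim.jl optimizer states (illustrative only).
struct LBFGSStateMock
    g_previous::Vector{Float64}   # L-BFGS keeps the previous gradient
end

struct GradientDescentStateMock
    x_previous::Vector{Float64}   # GradientDescent keeps the previous point instead
end

# Return the previous gradient if the state stores one, `nothing` otherwise.
function previous_gradient(state)
    if hasfield(typeof(state), :g_previous)
        return state.g_previous
    else
        return nothing  # no gradient history available for this optimizer
    end
end

previous_gradient(LBFGSStateMock([0.1, 0.2]))      # → [0.1, 0.2]
previous_gradient(GradientDescentStateMock([1.0])) # → nothing
```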
-
What you're seeing with

```
Optimization failed to converge
```

is a bug; see JuliaQuantumControl/GRAPE.jl#38. It will be fixed in the next release. Since there's a major release of all packages in the pipeline, it might take a couple of weeks before this becomes available. In the meantime, you could apply the patch to your local installation.