Merge branch 'master' into od/ma2
odow authored Jan 1, 2023
2 parents c16e70a + 0871cfb commit b94d17e
Showing 40 changed files with 3,041 additions and 2,077 deletions.
4 changes: 2 additions & 2 deletions Project.toml
@@ -1,7 +1,7 @@
name = "JuMP"
uuid = "4076af6c-e467-56ae-b986-b466b2749572"
repo = "https://github.com/jump-dev/JuMP.jl.git"
version = "1.5.0"
version = "1.6.0"

[deps]
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
@@ -12,7 +12,7 @@ Printf = "de0858da-6303-5e67-8744-51eddeeeb8d7"
SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"

[compat]
MathOptInterface = "1.10"
MathOptInterface = "1.11"
MutableArithmetics = "1.1"
OrderedCollections = "1"
julia = "1.6"
2 changes: 1 addition & 1 deletion README.md
@@ -10,7 +10,7 @@ embedded in [Julia](https://julialang.org/). You can find out more about us by
visiting [jump.dev](https://jump.dev).


**Latest Release**: [![version](https://juliahub.com/docs/JuMP/DmXqY/1.5.0/version.svg)](https://juliahub.com/ui/Packages/JuMP/DmXqY/1.5.0) (`release-1.0` branch):
**Latest Release**: [![version](https://juliahub.com/docs/JuMP/DmXqY/1.6.0/version.svg)](https://juliahub.com/ui/Packages/JuMP/DmXqY/1.6.0) (`release-1.0` branch):
* Installation via the Julia package manager:
* `import Pkg; Pkg.add("JuMP")`
* Get help:
1 change: 1 addition & 0 deletions docs/make.jl
@@ -88,6 +88,7 @@ if !_FAST
for file in [
joinpath("getting_started", "getting_started_with_julia.md"),
joinpath("getting_started", "getting_started_with_JuMP.md"),
joinpath("getting_started", "debugging.md"),
joinpath("linear", "tips_and_tricks.md"),
]
filename = joinpath(@__DIR__, "src", "tutorials", file)
23 changes: 21 additions & 2 deletions docs/src/changelog.md
@@ -7,15 +7,34 @@ CurrentModule = JuMP
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## Unreleased
## Version 1.6.0 (January 1, 2023)

### Added

- Added a `result` keyword argument to [`solution_summary`](@ref) to allow
  summarizing models with multiple solutions (#3138)
- Added [`relax_with_penalty!`](@ref), which is a useful tool when debugging
infeasible models (#3140)
- Added [`has_start_value`](@ref) (#3157)
- Added support for [`HermitianPSDCone`](@ref) in constraints (#3154)

### Fixed

- Fixed promotion of complex expressions (#3150) (#3164)

### Other

- Added Benders tutorial with in-place resolves (#3145)
- Added more [Tips and tricks](@id linear_tips_and_tricks) for linear programs
(#3144)
(#3144) (#3163)
- Clarified documentation that `start` can depend on the indices of a
variable container (#3148)
- Replaced instances of `length` and `size` with the recommended `eachindex` and
  `axes` (#3149)
- Added a warning explaining why solution results cannot be queried from a
  modified model (#3156)
- Clarified documentation that `PSD` ensures a symmetric matrix (#3159)
- Maintenance of the JuMP test suite (#3146) (#3158) (#3162)

## Version 1.5.0 (December 8, 2022)

24 changes: 24 additions & 0 deletions docs/src/manual/complex.md
@@ -251,3 +251,27 @@ GenericAffExpr{ComplexF64, VariableRef}
julia> typeof(H[2, 1])
GenericAffExpr{ComplexF64, VariableRef}
```

## Hermitian PSD constraints

The [`HermitianPSDCone`](@ref) can also be used in the [`@constraint`](@ref)
macro:
```jldoctest
julia> model = Model();

julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
 x[1]
 x[2]

julia> import LinearAlgebra

julia> H = LinearAlgebra.Hermitian([x[1] 1im; -1im -x[2]])
2×2 LinearAlgebra.Hermitian{GenericAffExpr{ComplexF64, VariableRef}, Matrix{GenericAffExpr{ComplexF64, VariableRef}}}:
 x[1]          (0.0 + 1.0im)
 (0.0 - 1.0im) (-1.0 - 0.0im) x[2]

julia> @constraint(model, H in HermitianPSDCone())
[x[1]          (0.0 + 1.0im);
 (0.0 - 1.0im) (-1.0 + 0.0im) x[2]] ∈ HermitianPSDCone()
```
45 changes: 25 additions & 20 deletions docs/src/manual/solutions.md
@@ -35,23 +35,25 @@ Subject to

## Solutions summary

[`solution_summary`](@ref) can be used for checking the summary of the optimization solutions.
[`solution_summary`](@ref) can be used for checking the summary of the
optimization solutions.

```jldoctest solutions; filter=r"[0-9]+\.[0-9]+e[\+\-][0-9]+"
julia> solution_summary(model)
* Solver : HiGHS
* Status
Result count : 1
Termination status : OPTIMAL
Primal status : FEASIBLE_POINT
Dual status : FEASIBLE_POINT
Message from the solver:
"kHighsModelStatusOptimal"
* Candidate solution
Objective value : -2.05143e+02
Objective bound : -0.00000e+00
Relative gap : Inf
* Candidate solution (result #1)
Primal status : FEASIBLE_POINT
Dual status : FEASIBLE_POINT
Objective value : -2.05143e+02
Objective bound : -0.00000e+00
Relative gap : Inf
Dual objective value : -2.05143e+02
* Work counters
@@ -60,22 +62,21 @@ julia> solution_summary(model)
Barrier iterations : 0
Node count : -1
julia> solution_summary(model, verbose=true)
julia> solution_summary(model; verbose = true)
* Solver : HiGHS
* Status
Termination status : OPTIMAL
Primal status : FEASIBLE_POINT
Dual status : FEASIBLE_POINT
Result count : 1
Has duals : true
Termination status : OPTIMAL
Message from the solver:
"kHighsModelStatusOptimal"
* Candidate solution
Objective value : -2.05143e+02
Objective bound : -0.00000e+00
Relative gap : Inf
* Candidate solution (result #1)
Primal status : FEASIBLE_POINT
Dual status : FEASIBLE_POINT
Objective value : -2.05143e+02
Objective bound : -0.00000e+00
Relative gap : Inf
Dual objective value : -2.05143e+02
Primal solution :
x : 1.54286e+01
@@ -304,7 +305,7 @@ If you are iteratively querying solution information and modifying a model,
query all the results first, then modify the problem.

For example, instead of:
```jldoctest
```jldoctest; filter = r"\@ JuMP.+/src/JuMP.jl"
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
@@ -319,6 +320,8 @@ OPTIMAL::TerminationStatusCode = 1
julia> set_upper_bound(x, 1)
julia> x_val = value(x)
┌ Warning: The model has been modified since the last call to `optimize!` (or `optimize!` has not been called yet). If you are iteratively querying solution information and modifying a model, query all the results first, then modify the model.
└ @ JuMP ~/work/JuMP.jl/JuMP.jl/src/JuMP.jl:1250
ERROR: OptimizeNotCalled()
Stacktrace:
[...]
@@ -533,9 +536,10 @@ end
Some solvers support returning multiple solutions. You can check how many
solutions are available to query using [`result_count`](@ref).

Functions for querying the solutions, for example, [`primal_status`](@ref) and
[`value`](@ref), all take an additional keyword argument `result` which can be
used to specify which result to return.
Functions for querying the solutions, for example, [`primal_status`](@ref),
[`dual_status`](@ref), [`value`](@ref), [`dual`](@ref), and [`solution_summary`](@ref)
all take an additional keyword argument `result` which can be used to specify
which result to return.

!!! warning
Even if [`termination_status`](@ref) is `OPTIMAL`, some of the returned
@@ -558,6 +562,7 @@ end
an_optimal_solution = value.(x; result = 1)
optimal_objective = objective_value(model; result = 1)
for i in 2:result_count(model)
print(solution_summary(model; result = i))
@assert has_values(model; result = i)
println("Solution $(i) = ", value.(x; result = i))
obj = objective_value(model; result = i)
15 changes: 9 additions & 6 deletions docs/src/manual/variables.md
@@ -944,26 +944,29 @@ model[:x]

## Semidefinite variables

A square symmetric matrix ``X`` is positive semidefinite if all eigenvalues are
nonnegative.
Declare a square matrix of JuMP variables to be positive semidefinite by passing
`PSD` as an optional positional argument:

Declare a matrix of JuMP variables to be positive semidefinite by passing `PSD`
as an optional positional argument:
```jldoctest; setup=:(model=Model())
julia> @variable(model, x[1:2, 1:2], PSD)
2×2 LinearAlgebra.Symmetric{VariableRef, Matrix{VariableRef}}:
x[1,1] x[1,2]
x[1,2] x[2,2]
```

This will ensure that `x` is symmetric, and that all of its eigenvalues are
nonnegative.

!!! note
    `x` must be a square, 2-dimensional `Array` of JuMP variables; it cannot be
    a `DenseAxisArray` or a `SparseAxisArray`.

## Symmetric variables

Declare a square matrix of JuMP variables to be symmetric by passing
`Symmetric` as an optional positional argument:
Declare a square matrix of JuMP variables to be symmetric (but not necessarily
positive semidefinite) by passing `Symmetric` as an optional positional
argument:

```jldoctest; setup=:(model=Model())
julia> @variable(model, x[1:2, 1:2], Symmetric)
2×2 LinearAlgebra.Symmetric{VariableRef, Matrix{VariableRef}}:
1 change: 1 addition & 0 deletions docs/src/reference/constraints.md
@@ -33,6 +33,7 @@ normalized_rhs
set_normalized_rhs
add_to_function_constant
relax_with_penalty!
```

## Deletion
1 change: 1 addition & 0 deletions docs/src/reference/variables.md
@@ -38,6 +38,7 @@ variable_by_name
## Start values

```@docs
has_start_value
set_start_value
start_value
```
@@ -71,7 +71,7 @@ function get_data()
6.6 49
5.1 42
]
return Data([Piece(data[i, 1], data[i, 2]) for i in 1:size(data, 1)], 100.0)
return Data([Piece(data[i, 1], data[i, 2]) for i in axes(data, 1)], 100.0)
end

data = get_data()
53 changes: 49 additions & 4 deletions docs/src/tutorials/getting_started/debugging.jl
@@ -164,11 +164,11 @@ import HiGHS

# A simple example of an infeasible model is:

model = Model(HiGHS.Optimizer)
model = Model(HiGHS.Optimizer);
set_silent(model)
@variable(model, x >= 0)
@objective(model, Max, 2x + 1)
@constraint(model, 2x - 1 <= -2)
@constraint(model, con, 2x - 1 <= -2)

# because the bound says that `x >= 0`, but we can rewrite the constraint to be
# `x <= -1/2`. When the problem is infeasible, JuMP may return one of a number
@@ -230,6 +230,51 @@ termination_status(model)
# solvers such as Gurobi and CPLEX do. If the solver does support computing
# conflicts, read [Conflicts](@ref) for more details.

# ### Penalty relaxation

# Another strategy to debug sources of infeasibility is the
# [`relax_with_penalty!`](@ref) function.
#
# The penalty relaxation modifies constraints of the form ``f(x) \in S`` into
# ``f(x) + y - z \in S``, where ``y, z \ge 0``, and then it introduces a
# penalty term into the objective of ``a \times (y + z)`` (if minimizing, else
# ``-a``), where ``a`` is a penalty.

map = relax_with_penalty!(model)

# Here `map` is a dictionary which maps constraint indices to an affine
# expression representing ``(y + z)``.

# If we optimize the relaxed model, this time we get a feasible solution:

optimize!(model)
termination_status(model)

# Iterate over the contents of `map` to see which constraints are violated:

for (con, penalty) in map
violation = value(penalty)
if violation > 0
println("Constraint `$(name(con))` is violated by $violation")
end
end

# Once you find a violated constraint in the relaxed problem, take a look to see
# if there is a typo or other common mistake in that particular constraint.

# Consult the docstring of [`relax_with_penalty!`](@ref) for information on how
# to modify the penalty cost term `a`, either for every constraint in the model
# or for a particular subset of the constraints.
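
# For example (a sketch, assuming a freshly built copy of the infeasible model
# from above, so that the constraint `con` has not already been relaxed), you
# can relax only `con`, at a penalty of `2.0` per unit of violation, by passing
# a dictionary of penalties:

map = relax_with_penalty!(model, Dict(con => 2.0))

# Constraints that do not appear in the dictionary are left unrelaxed.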

# When using [`relax_with_penalty!`](@ref), you should be aware that:
#
# * Variable bounds and integrality restrictions are not relaxed. If the
# problem is still infeasible after calling [`relax_with_penalty!`](@ref),
# check the variable bounds.
# * You cannot undo the penalty relaxation. If you need an unmodified model,
# rebuild the problem, or call [`copy_model`](@ref) before calling
# [`relax_with_penalty!`](@ref).
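
# For example (a sketch of the [`copy_model`](@ref) workflow; `copy_model`
# returns the copied model together with a reference map from the variables
# and constraints of the original to those of the copy):

relaxed_model, reference_map = copy_model(model)
set_optimizer(relaxed_model, HiGHS.Optimizer)
relax_with_penalty!(relaxed_model)

# The original `model` is untouched, so you can continue to query and modify
# it while experimenting with `relaxed_model`.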

# ## Debugging an unbounded model

# A model is unbounded if there is no limit on how good the objective value can
Expand All @@ -241,7 +286,7 @@ termination_status(model)

# A simple example of an unbounded model is:

model = Model(HiGHS.Optimizer)
model = Model(HiGHS.Optimizer);
set_silent(model)
@variable(model, x >= 0)
@objective(model, Max, 2x + 1)
@@ -288,7 +333,7 @@
# the variable must be less-than or equal to the expression of the objective
# function. For example:

model = Model(HiGHS.Optimizer)
model = Model(HiGHS.Optimizer);
set_silent(model)
@variable(model, x >= 0)
## @objective(model, Max, 2x + 1)
2 changes: 1 addition & 1 deletion docs/src/tutorials/linear/tips_and_tricks.jl
@@ -101,7 +101,7 @@ model = Model();

# This reformulation does not work for ``t \ge \min\{x, y\}``.

# ## Max
# ## Min

# To model ``t \le \min\{x, y\}``, do:

17 changes: 15 additions & 2 deletions src/JuMP.jl
@@ -1055,9 +1055,10 @@ Base.ndims(::AbstractJuMPScalar) = 0

# These are required to create symmetric containers of AbstractJuMPScalars.
LinearAlgebra.symmetric_type(::Type{T}) where {T<:AbstractJuMPScalar} = T
LinearAlgebra.hermitian_type(::Type{T}) where {T<:AbstractJuMPScalar} = T
LinearAlgebra.symmetric(scalar::AbstractJuMPScalar, ::Symbol) = scalar
# This is required for linear algebra operations involving transposes.
LinearAlgebra.adjoint(scalar::AbstractJuMPScalar) = scalar
LinearAlgebra.hermitian(scalar::AbstractJuMPScalar, ::Symbol) = adjoint(scalar)
LinearAlgebra.adjoint(scalar::AbstractJuMPScalar) = conj(scalar)

"""
owner_model(s::AbstractJuMPScalar)
@@ -1246,6 +1247,12 @@ function MOI.get(
if !MOI.is_set_by_optimize(attr)
return MOI.get(backend(model), attr, index(v))
elseif model.is_model_dirty && mode(model) != DIRECT
@warn(
"The model has been modified since the last call to `optimize!` (" *
"or `optimize!` has not been called yet). If you are iteratively " *
"querying solution information and modifying a model, query all " *
"the results first, then modify the model.",
)
throw(OptimizeNotCalled())
end
return _moi_get_result(backend(model), attr, index(v))
@@ -1260,6 +1267,12 @@ function MOI.get(
if !MOI.is_set_by_optimize(attr)
return MOI.get(backend(model), attr, index(cr))
elseif model.is_model_dirty && mode(model) != DIRECT
@warn(
"The model has been modified since the last call to `optimize!` (" *
"or `optimize!` has not been called yet). If you are iteratively " *
"querying solution information and modifying a model, query all " *
"the results first, then modify the model.",
)
throw(OptimizeNotCalled())
end
return _moi_get_result(backend(model), attr, index(cr))