#=
In this tutorial you will learn how to use JSO-compliant solvers to solve a PDE-constrained optimization problem discretized with [PDENLPModels.jl](https://github.com/JuliaSmoothOptimizers/PDENLPModels.jl).
\toc
## Problem Statement
In this first part, we define a distributed Poisson control problem with Dirichlet boundary conditions, which is then automatically discretized.
We refer to [Gridap.jl](https://github.com/gridap/Gridap.jl) for more details on modeling PDEs and [PDENLPModels.jl](https://github.com/JuliaSmoothOptimizers/PDENLPModels.jl) for PDE-constrained optimization problems.
Let $\Omega = (-1,1)^2$. We solve the following problem:
$$
\begin{aligned}
\min_{y \in H^1_0, u \in H^1} \quad & \frac{1}{2} \int_\Omega |y(x) - y_d(x)|^2dx + \frac{\alpha}{2} \int_\Omega |u|^2dx \\
\text{s.t.} & -\Delta y = h + u, \quad x \in \Omega, \\
& y = 0, \quad x \in \partial \Omega,
\end{aligned}
$$
where $y_d(x) = -x_1^2$ and $\alpha = 10^{-2}$.
The force term is $h(x_1, x_2) = -\sin(\omega x_1)\sin(\omega x_2)$ with $\omega = \pi - \frac{1}{8}$.
=#
using Gridap, PDENLPModels
# First, we define the domain and its discretization.
n = 100
domain = (-1, 1, -1, 1)
partition = (n, n)
model = CartesianDiscreteModel(domain, partition)
# Then, we introduce the definition of the finite element spaces.
reffe = ReferenceFE(lagrangian, Float64, 1)
Xpde = TestFESpace(model, reffe; conformity = :H1, dirichlet_tags = "boundary")
y0(x) = 0.0
Ypde = TrialFESpace(Xpde, y0)
reffe_con = ReferenceFE(lagrangian, Float64, 1)
Xcon = TestFESpace(model, reffe_con; conformity = :H1)
Ycon = TrialFESpace(Xcon)
Y = MultiFieldFESpace([Ypde, Ycon])
# Gridap also requires setting up the integration machinery used below to define the objective function and the constraint operator.
trian = Triangulation(model)
degree = 1
dΩ = Measure(trian, degree)
yd(x) = -x[1]^2
α = 1e-2
# The objective matches the problem statement: ½∫|y - y_d|² + (α/2)∫|u|².
function f(y, u)
  ∫(0.5 * (yd - y) * (yd - y) + 0.5 * α * u * u) * dΩ
end
ω = π - 1 / 8
h(x) = -sin(ω * x[1]) * sin(ω * x[2])
# Weak form of -Δy = h + u with homogeneous Dirichlet boundary conditions.
function res(y, u, v)
  ∫(∇(v) ⊙ ∇(y) - v * u - v * h) * dΩ
end
op = FEOperator(res, Y, Xpde)
npde = Gridap.FESpaces.num_free_dofs(Ypde)
ncon = Gridap.FESpaces.num_free_dofs(Ycon)
x0 = zeros(npde + ncon);
# Putting everything together, we build a `GridapPDENLPModel`, which implements the [NLPModel](https://juliasmoothoptimizers.github.io/NLPModels.jl/stable/) API.
nlp = GridapPDENLPModel(x0, f, trian, Ypde, Ycon, Xpde, Xcon, op, name = "Control elastic membrane")
using NLPModels
(get_nvar(nlp), get_ncon(nlp))
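# Since `nlp` implements the NLPModel API, we can, for instance, evaluate the
# objective at the initial guess as a quick sanity check:
obj(nlp, nlp.meta.x0)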
#=
## Find a Feasible Point
Before solving the previously defined model, we will first improve our initial guess.
The first step is to create a nonlinear least-squares problem whose residual is the equality constraint of the optimization problem.
We use `FeasibilityResidual` from [NLPModelsModifiers.jl](https://github.com/JuliaSmoothOptimizers/NLPModelsModifiers.jl) to convert the NLPModel into an NLSModel.
Then, using `trunk`, a matrix-free solver for least-squares problems implemented in [JSOSolvers.jl](https://github.com/JuliaSmoothOptimizers/JSOSolvers.jl), we find an improved guess that is close to being feasible for our large-scale problem.
By default, JSO-compliant solvers use `nlp.meta.x0` as an initial guess.
=#
using JSOSolvers, LinearAlgebra, NLPModelsModifiers
nls = FeasibilityResidual(nlp)
stats_trunk = trunk(nls)
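# `trunk` returns a `GenericExecutionStats` from SolverCore.jl; its `status`
# field indicates how the solver stopped:
stats_trunk.status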
# We check the solution from the stats returned by `trunk`:
norm(cons(nlp, stats_trunk.solution))
# We will use the solution found to initialize our solvers.
# ## Solve the Problem
# Finally, we are ready to solve the PDE-constrained optimization problem with a target tolerance of `1e-5`.
# In the following, we will use both Ipopt and DCI on our problem.
# We refer to the tutorial [How to solve a small optimization problem with Ipopt + NLPModels](https://jso-docs.github.io/solve-an-optimization-problem-with-ipopt/)
# for more information on `NLPModelsIpopt`.
using NLPModelsIpopt
# Set `print_level = 0` to avoid printing detailed iteration information.
stats_ipopt = ipopt(nlp, x0 = stats_trunk.solution, tol = 1e-5, print_level = 0)
# The problem was successfully solved, and we can extract the function evaluations from the stats.
nlp.counters
# Reinitialize the counters before re-solving.
reset!(nlp);
# Most JSO-compliant solvers use a logger to print iteration information.
# Wrapping the call with a `NullLogger` suppresses this output.
using DCISolver, Logging
stats_dci = with_logger(NullLogger()) do
  dci(nlp, stats_trunk.solution, atol = 1e-5, rtol = 0.0)
end
# The problem was successfully solved, and we can extract the function evaluations from the stats.
nlp.counters
# We now compare the two solvers with respect to the time spent,
stats_ipopt.elapsed_time, stats_dci.elapsed_time
# and also check objective value, feasibility, and dual feasibility of `ipopt` and `dci`.
(stats_ipopt.objective, stats_ipopt.primal_feas, stats_ipopt.dual_feas),
(stats_dci.objective, stats_dci.primal_feas, stats_dci.dual_feas)
# Overall, `DCISolver` does a great job solving this large-scale optimization problem!
# You can try increasing the problem size by changing the discretization parameter `n`.
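#=
As a possible extension, we can recover the state and control as finite-element
functions and export them for visualization. This sketch assumes the solution
vector stores the state degrees of freedom first, followed by the control,
mirroring how we assembled `x0` above; the output file name is arbitrary.
=#
yh = FEFunction(Ypde, stats_dci.solution[1:npde]) # state y
uh = FEFunction(Ycon, stats_dci.solution[(npde + 1):(npde + ncon)]) # control u
writevtk(trian, "poisson-control", cellfields = ["y" => yh, "u" => uh])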