
Commit 6b42abc

Fix nested bullet list in optimization.md
The mismatched indents were breaking the markdown list and causing parsers to read some sub-lists as code blocks.
1 parent edff6e7 · commit 6b42abc

1 file changed: +19 -19 lines


docs/src/optimization_packages/optimization.md

Lines changed: 19 additions & 19 deletions
@@ -4,28 +4,28 @@ There are some solvers that are available in the Optimization.jl package directly

## Methods

  - `LBFGS`: The popular quasi-Newton method that leverages a limited-memory BFGS approximation of the inverse of the Hessian, through a wrapper over the [L-BFGS-B](https://users.iems.northwestern.edu/%7Enocedal/lbfgsb.html) Fortran routine accessed from the [LBFGSB.jl](https://github.com/Gnimuc/LBFGSB.jl/) package. It directly supports box constraints (see the usage sketch after this list).

    It can also handle arbitrary nonlinear constraints through an augmented Lagrangian method with bound constraints, described in Section 17.4 of Numerical Optimization by Nocedal and Wright, making it a general-purpose nonlinear optimization solver available directly in Optimization.jl.

  - `Sophia`: Based on the recent paper https://arxiv.org/abs/2305.14342. It incorporates second-order information in the form of the diagonal of the Hessian matrix, avoiding the need to compute the full Hessian. It has been shown to converge faster than first-order methods such as Adam and SGD (a usage sketch follows this list).

    + `solve(problem, Sophia(; η, βs, ϵ, λ, k, ρ))`

    + `η` is the learning rate
    + `βs` are the decay rates of the momentum terms
    + `ϵ` is the epsilon value
    + `λ` is the weight decay parameter
    + `k` is the number of iterations between re-computations of the diagonal of the Hessian matrix
    + `ρ` is the momentum
    + Defaults:

      * `η = 0.001`
      * `βs = (0.9, 0.999)`
      * `ϵ = 1e-8`
      * `λ = 0.1`
      * `k = 10`
      * `ρ = 0.04`
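For reference, a minimal usage sketch of the `Optimization.LBFGS` interface described above. The Rosenbrock objective, the ForwardDiff AD backend, and the bound values are illustrative assumptions, not part of this diff:

```julia
using Optimization, ForwardDiff

# Illustrative objective (Rosenbrock); x is the decision vector, p holds fixed parameters.
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

# L-BFGS needs gradients; here ForwardDiff supplies them automatically.
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())

# Box constraints are passed via lb/ub, which Optimization.LBFGS supports directly.
prob = OptimizationProblem(optf, x0, p; lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, Optimization.LBFGS())
```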
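And a sketch of calling `Sophia` with its documented defaults spelled out explicitly; the objective, the Zygote AD backend, and the `maxiters` value are illustrative assumptions:

```julia
using Optimization, Zygote

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

# Sophia estimates the diagonal of the Hessian through the AD backend, so one must be supplied.
optf = OptimizationFunction(rosenbrock, Optimization.AutoZygote())
prob = OptimizationProblem(optf, x0, p)

# All keyword arguments shown at their documented defaults; maxiters is illustrative.
sol = solve(prob,
    Optimization.Sophia(; η = 0.001, βs = (0.9, 0.999), ϵ = 1e-8,
        λ = 0.1, k = 10, ρ = 0.04);
    maxiters = 1000)
```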
## Examples