https://github.com/avik-pal/speedmapping.jl
General fixed point mapping acceleration and optimization in Julia
Science Score: 10.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ○ codemeta.json file
- ○ .zenodo.json file
- ○ DOI references
- ✓ Academic publication links (links to: arxiv.org)
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity (low similarity, 8.7%, to scientific vocabulary)
Last synced: 6 months ago
Repository
Statistics
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
- Releases: 0
Fork of NicolasL-S/SpeedMapping.jl
Created 11 months ago · Last pushed 11 months ago
https://github.com/avik-pal/SpeedMapping.jl/blob/main/
# SpeedMapping
[CI](https://github.com/NicolasL-S/SpeedMapping.jl/actions)
[Coverage](https://codecov.io/gh/NicolasL-S/SpeedMapping.jl)
SpeedMapping accelerates the convergence of a mapping to a fixed point using the Alternating cyclic extrapolation algorithm. Since gradient descent is an example of such a mapping, SpeedMapping can also perform multivariate optimization given the objective's gradient. Typical uses are:
Accelerating a fixed-point mapping
```julia
julia> using SpeedMapping, LinearAlgebra
julia> function power_iteration!(x_out, x_in)
mul!(x_out, [1 2;2 3], x_in)
x_out ./= maximum(abs.(x_out))
end;
julia> dominant_eigenvector = speedmapping(ones(2); m! = power_iteration!).minimizer
2-element Vector{Float64}:
0.6180339887498947
1.0
```
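Any in-place mapping with the `(x_out, x_in)` signature can be accelerated the same way. As a second, hedged sketch (the `cos_map!` helper name is made up for illustration), the classic contraction x ↦ cos(x) converges to its unique fixed point, the Dottie number:

```julia
using SpeedMapping

# A classic contraction: x ↦ cos(x) has a unique fixed point ≈ 0.7390851
# (the Dottie number). cos_map! is a hypothetical helper; any in-place
# mapping with the (x_out, x_in) signature works with the m! keyword.
cos_map!(x_out, x_in) = (x_out .= cos.(x_in))

fp = speedmapping([1.0]; m! = cos_map!).minimizer
# fp[1] ≈ 0.7390851 (the Dottie number)
```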
Optimizing a function
```julia
julia> using SpeedMapping
julia> rosenbrock(x) = (1 - x[1])^2 + 100(x[2] - x[1]^2)^2;
julia> solution = speedmapping(zeros(2); f = rosenbrock).minimizer
2-element Vector{Float64}:
1.0000000000001315
0.9999999999999812
```
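When an analytic gradient is available, it can be supplied in place of finite differencing. A minimal sketch, assuming the package's `g!` keyword for an in-place gradient (check the documentation linked below for the exact interface):

```julia
using SpeedMapping

rosenbrock(x) = (1 - x[1])^2 + 100(x[2] - x[1]^2)^2

# In-place analytic gradient of the Rosenbrock function.
function rosenbrock_grad!(∇, x)
    ∇[1] = -2(1 - x[1]) - 400x[1] * (x[2] - x[1]^2)
    ∇[2] = 200(x[2] - x[1]^2)
end

sol = speedmapping(zeros(2); f = rosenbrock, g! = rosenbrock_grad!).minimizer
# sol ≈ [1.0, 1.0]
```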
## Documentation
[Documentation](https://NicolasL-S.github.io/SpeedMapping.jl/stable)
### The Alternating cyclic extrapolation algorithm
Let *F* : ℝⁿ → ℝⁿ denote a mapping which admits continuous, bounded partial derivatives. A *p*-order cyclic extrapolation builds the next iterate from the last *p* applications of *F*, with an extrapolation step size σ expressed in Aitken's Δ notation (the full expressions are given in the reference below). The algorithm alternates between *p* = 3 and *p* = 2. For gradient descent acceleration, σ is also used to adjust the learning rate dynamically.
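The extrapolation idea can be illustrated with the classic scalar Aitken Δ² scheme, a simplified relative of the *p* = 2 case. This is an illustrative sketch only, not the package's implementation: ACX generalizes it to vectors, higher orders *p*, and the alternating *p* = 3 / *p* = 2 schedule.

```julia
# Aitken Δ² acceleration for a scalar fixed-point iteration x ↦ F(x).
# Illustrative sketch; not the ACX algorithm itself.
function aitken_fixed_point(F, x0; iters = 10)
    x = x0
    for _ in 1:iters
        x1 = F(x)
        x2 = F(x1)
        Δ1 = x1 - x          # first difference Δx
        Δ2 = x2 - 2x1 + x    # second difference Δ²x (Aitken's notation)
        abs(Δ2) < eps() && return x2
        x = x - Δ1^2 / Δ2    # extrapolated iterate
    end
    return x
end

aitken_fixed_point(cos, 1.0)  # converges to the Dottie number ≈ 0.7390851
```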
Reference:
N. Lepage-Saucier, _Alternating cyclic extrapolation methods for optimization algorithms_, arXiv:2104.04974 (2021). https://arxiv.org/abs/2104.04974
Owner
- Name: Avik Pal
- Login: avik-pal
- Kind: user
- Location: Cambridge, MA
- Company: Massachusetts Institute of Technology
- Website: https://avik-pal.github.io
- Twitter: avikpal1410
- Repositories: 46
- Profile: https://github.com/avik-pal
PhD Student @mit || Prev: BTech CSE IITK
GitHub Events
Total
- Push event: 1
- Pull request event: 1
Last Year
- Push event: 1
- Pull request event: 1