Science Score: 77.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: Found CITATION.cff file
- ✓ codemeta.json file: Found codemeta.json file
- ✓ .zenodo.json file: Found .zenodo.json file
- ✓ DOI references: Found 4 DOI reference(s) in README
- ✓ Academic publication links: Links to arxiv.org
- ✓ Committers with academic emails: 1 of 3 committers (33.3%) from academic institutions
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: Low similarity (12.6%) to scientific vocabulary
Keywords
Repository
Proximal optimization in pure python
Basic Info
Statistics
- Stars: 118
- Watchers: 8
- Forks: 25
- Open Issues: 4
- Releases: 0
Topics
Metadata Files
README.md
Proximal Minimization
The methods in this package provide solvers for constrained optimization problems. All of them use proximal operators to deal with non-smooth penalty functions.
The algorithms:
- Proximal Gradient Method (PGM/ISTA): forward-backward splitting with a single smooth function with a Lipschitz-continuous gradient and a single (non-smooth) penalty function. Optional multi-block optimization, Nesterov acceleration (FISTA), and Barzilai-Borwein steps.
- Proximal Adam and derivatives (AdamX, AMSGrad, PAdam, NAdam): forward-backward splitting with adaptive gradient steps for single- and multi-block optimization.
- Alternating Direction Method of Multipliers (ADMM): Douglas-Rachford splitting for two potentially non-smooth functions. We use its linearized form to solve for linear mappings in the penalty functions.
- Simultaneous Direction Method of Multipliers (SDMM): Extension of linearized ADMM for several penalty functions.
- Block-Simultaneous Direction Method of Multipliers (bSDMM): Extension of SDMM to work with objective functions that are convex in several arguments. It is a proximal version of block coordinate descent methods.
Two-block PGM or bSDMM serves as the backend solver for Non-negative Matrix Factorization (NMF). Since the algorithms allow any proxable function as a constraint on each of the matrix factors, we prefer the term Constrained Matrix Factorization.
Details can be found in the paper "Block-Simultaneous Direction Method of Multipliers - A proximal primal-dual splitting algorithm for nonconvex problems with multiple constraints" by Fred Moolekamp and Peter Melchior. We ask that any published work that utilizes this package cite this paper.
A description of the proximal Adam method is in the paper "Proximal Adam: Robust Adaptive Update Scheme for Constrained Optimization" by Peter Melchior, Rémy Joseph and Fred Moolekamp.
Installation and Dependencies
```
pip install proxmin
```
For the latest development version, clone this repository and execute python setup.py install.
The code works on python>2.7 and requires numpy and scipy. It is fully compatible with gradient computation by autograd.
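As a minimal sketch of that autograd compatibility (nothing in this block is part of proxmin itself; the shifted parabola f anticipates the example further below), the gradient callback can be generated automatically:

```python
# Minimal sketch: letting autograd build the gradient callback (assumes autograd is installed).
import autograd.numpy as np   # autograd's drop-in wrapper around numpy
from autograd import grad

dX = np.array([1., 0.5])

def f(X):
    """Shifted parabola, smooth in X."""
    return np.sum((X - dX)**2, axis=-1)

grad_f = grad(f)  # callable gradient; can be passed wherever a hand-written gradient is expected
print(grad_f(np.array([0., 0.])))  # -> [-2. -1.], i.e. 2*(X - dX)
```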
Approach
The gradient-based methods PGM and Adam expect two callback functions: one to compute the gradients, the other to compute the step sizes. For PGM, the step sizes are bounded between 0 and 2/L, where L is the Lipschitz constant of the gradient.
The penalty functions are given as proximal mappings: X <- prox(X, step).
Many proximal operators can be constructed analytically, see e.g. Parikh & Boyd (2014). We provide a number of common ones in proxmin.operators. An important class of constraints are indicator functions of convex sets, for which the proximal operator, given some point X, returns the closest point to X in the Euclidean norm that is in the set.
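For concreteness, here are two common analytic operators written in the X <- prox(X, step) convention above. These are generic sketches and may not match the exact names or signatures used in proxmin.operators:

```python
import numpy as np

def prox_l1(X, step, thresh=1.0):
    """Soft thresholding: the proximal operator of step * thresh * ||X||_1."""
    return np.sign(X) * np.maximum(np.abs(X) - step * thresh, 0)

def prox_box(X, step, lower=0.0, upper=1.0):
    """Indicator function of the box [lower, upper]: the prox is the Euclidean
    projection onto the set, so the step size is irrelevant."""
    return np.clip(X, lower, upper)
```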
Example: find the minimum of a shifted parabola on the unit circle in 2D
```python
import numpy as np
import proxmin

dX = np.array([1., 0.5])
radius = 1

def f(X):
    """Shifted parabola"""
    return np.sum((X - dX)**2, axis=-1)

def grad_f(X):
    return 2*(X - dX)

def step_f(X, it=0):
    L = 2.  # Lipschitz constant of grad_f
    return 1 / L

def prox_circle(X, step):
    """Projection onto circle"""
    center = np.array([0, 0])
    dX = X - center
    # exclude everything other than perimeter of circle
    phi = np.arctan2(dX[1], dX[0])
    return center + radius*np.array([np.cos(phi), np.sin(phi)])

X = np.array([-1., -1.])  # or wherever
converged, grad, step = proxmin.pgm(X, grad_f, step_f, prox=prox_circle)
```
Since the objective function is smooth and there is only one constraint, one can simply perform a sequence of forward-backward steps: a gradient descent step, followed by a projection onto the constraint set. That is, in essence, the proximal gradient method.
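Spelled out as code, that forward-backward sequence might look like the hand-rolled sketch below (reusing grad_f, step_f, and prox_circle from the example above); proxmin.pgm wraps this basic iteration with convergence testing and the optional acceleration mentioned earlier:

```python
# Hand-rolled forward-backward loop, for illustration only.
X = np.array([-1., -1.])
for it in range(100):
    step = step_f(X, it)                           # 1/L for this problem
    X = prox_circle(X - step * grad_f(X), step)    # gradient step, then projection
print(X)  # approaches the point on the unit circle closest to dX
```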
If both functions are non-smooth, one can use ADMM, which operates on two proxed functions. Unlike PGM, feasibility is achieved only at the end of the optimization and only within some error tolerance.
Continuing the example above, the smooth function gets turned into a proxed function by performing the gradient step internally and returning the updated position:
```python
def prox_gradf(X, step):
    """Proximal gradient step"""
    return X - step*grad_f(X)

converged = proxmin.admm(X, prox_gradf, step_f, prox_g=prox_circle, e_rel=1e-3, e_abs=1e-3)
```
Constrained matrix factorization (CMF)
Matrix factorization seeks to approximate a target matrix Y as a product np.dot(A, S). If the only constraints on the factors are non-negativity, the method is known as NMF.
We have extended the capabilities by allowing for an arbitrary number of constraints to be enforced on either matrix factor:
```python
# PGM approach for each factor
prox_A = ...  # a single constraint on A, solved by projection
prox_S = ...  # a single constraint on S, solved by projection
A0, S0 = ...  # initialization
proxmin.nmf.nmf(Y, A0, S0, prox_A=prox_A, prox_S=prox_S)

# same with AdaProx-AMSGrad
proxmin.nmf.nmf(Y, A0, S0, prox_A=prox_A, prox_S=prox_S,
                algorithm=proxmin.algorithms.adaprox, scheme="amsgrad")

# for multiple constraints, solved by ADMM-style split
proxs_g = [[...],  # list of proxs for A
           [...]]  # list of proxs for S
A, S = proxmin.nmf.nmf(Y, A0, S0, algorithm=proxmin.algorithms.bsdmm, proxs_g=proxs_g)

# or a combination
A, S = proxmin.nmf.nmf(Y, A0, S0, algorithm=proxmin.algorithms.bsdmm,
                       prox_A=prox_A, prox_S=prox_S, proxs_g=proxs_g)
```
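For instance, the prox_A and prox_S placeholders above could be simple Euclidean projections. A hedged sketch (this helper is illustrative, not taken from the proxmin API):

```python
import numpy as np

def prox_nonneg(X, step):
    """Projection onto the non-negative orthant: the classic NMF constraint."""
    return np.maximum(X, 0)

# e.g. prox_A = prox_nonneg and prox_S = prox_nonneg reproduces plain NMF,
# while any other proxable function can be swapped in per factor.
```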
Owner
- Name: Peter Melchior
- Login: pmelchior
- Kind: user
- Location: Princeton, NJ, USA
- Company: Princeton University
- Website: http://pmelchior.net
- Twitter: peter_melchior
- Repositories: 16
- Profile: https://github.com/pmelchior
Asst. Prof. of Statistical Astronomy
Citation (CITATION.cff)
```yaml
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
  - family-names: "Moolekamp"
    given-names: "Fred"
    orcid: "https://orcid.org/0000-0003-0093-4279"
  - family-names: "Melchior"
    given-names: "Peter"
    orcid: "https://orcid.org/0000-0002-8873-5065"
title: "proxmin"
url: "https://github.com/pmelchior/proxmin/"
preferred-citation:
  type: article
  authors:
    - family-names: "Moolekamp"
      given-names: "Fred"
      orcid: "https://orcid.org/0000-0003-0093-4279"
    - family-names: "Melchior"
      given-names: "Peter"
      orcid: "https://orcid.org/0000-0002-8873-5065"
  doi: "10.1007/s11081-018-9380-y"
  journal: "Optimization and Engineering"
  volume: 19
  start: 871 # First page number
  end: 885 # Last page number
  title: "Block-simultaneous direction method of multipliers: a proximal primal-dual splitting algorithm for nonconvex problems with multiple constraints"
  year: 2018
  month: 3
```
GitHub Events
Total
- Watch event: 7
Last Year
- Watch event: 7
Committers
Last synced: almost 3 years ago
All Time
- Total Commits: 250
- Total Committers: 3
- Avg Commits per committer: 83.333
- Development Distribution Score (DDS): 0.16
Top Committers
| Name | Email | Commits |
|---|---|---|
| Peter Melchior | p****r@g****m | 210 |
| fred3m | f****m@p****u | 39 |
| Fred Moolekamp | f****p@g****m | 1 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 11
- Total pull requests: 14
- Average time to close issues: 14 days
- Average time to close pull requests: 6 days
- Total issue authors: 5
- Total pull request authors: 3
- Average comments per issue: 1.91
- Average comments per pull request: 0.57
- Merged pull requests: 13
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- pmelchior (7)
- pythonometrist (1)
- andy-slac (1)
- machanic (1)
- fred3m (1)
Pull Request Authors
- pmelchior (9)
- fred3m (4)
- brianv0 (1)
Top Labels
Issue Labels
Pull Request Labels
Packages
- Total packages: 1
- Total downloads (pypi, last month): 140
- Total docker downloads: 3,884
- Total dependent packages: 2
- Total dependent repositories: 2
- Total versions: 19
- Total maintainers: 1
pypi.org: proxmin
Proximal methods for constrained optimization
- Homepage: https://github.com/pmelchior/proxmin
- Documentation: https://proxmin.readthedocs.io/
- License: MIT
- Latest release: 0.6.12 (published about 4 years ago)
Rankings
Maintainers (1)
Dependencies
- numpy *
- scipy *