proxtorch

An efficient GPU-compatible library built on PyTorch, offering a wide range of proximal operators and constraints for optimization and machine learning tasks.

https://github.com/jameschapman19/proxtorch

Science Score: 77.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 3 DOI reference(s) in README
  • Academic publication links
    Links to: zenodo.org
  • Committers with academic emails
    1 of 1 committers (100.0%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (14.0%) to scientific vocabulary

Keywords

proximal-gradient-descent proximal-operator proximal-operators proximal-regularization
Last synced: 6 months ago

Repository

An efficient GPU-compatible library built on PyTorch, offering a wide range of proximal operators and constraints for optimization and machine learning tasks.

Basic Info
Statistics
  • Stars: 5
  • Watchers: 1
  • Forks: 0
  • Open Issues: 0
  • Releases: 10
Topics
proximal-gradient-descent proximal-operator proximal-operators proximal-regularization
Created over 2 years ago · Last pushed over 2 years ago
Metadata Files
Readme License Citation

README.md

ProxTorch

**Unleashing Proximal Gradient Descent on PyTorch** 🚀

[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.5748062.svg)](https://doi.org/10.5281/zenodo.4382739) [![codecov](https://codecov.io/gh/jameschapman19/ProxTorch/graph/badge.svg?token=909RDXcEZK)](https://codecov.io/gh/jameschapman19/ProxTorch) [![version](https://img.shields.io/pypi/v/ProxTorch)](https://pypi.org/project/ProxTorch/) [![downloads](https://img.shields.io/pypi/dm/ProxTorch)](https://pypi.org/project/ProxTorch/)

🔍 What is ProxTorch?
ProxTorch is a Python library of proximal operators and constraints built on PyTorch. Whether you are tackling optimization problems or regularized machine learning models, ProxTorch is designed for speed, efficiency, and seamless GPU integration.
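For orientation: the proximal operator of a function g with parameter σ maps a point x to the minimizer of g(v) + (1/2σ)‖v − x‖². For the L1 norm this has a closed-form solution, soft-thresholding. A minimal, dependency-free sketch of that formula (the helper name is illustrative, not the ProxTorch API):

```python
# Soft-thresholding: the closed-form prox of sigma * ||.||_1, applied
# elementwise. Illustrative only -- ProxTorch wraps this (and many more
# operators) in GPU-ready PyTorch code.
def soft_threshold(x, sigma):
    return [max(abs(v) - sigma, 0.0) * (1.0 if v >= 0 else -1.0) for v in x]

print(soft_threshold([0.5, -1.2, 0.3, -0.4, 0.7], 0.1))
# each entry shrinks toward zero by 0.1 (approx. [0.4, -1.1, 0.2, -0.3, 0.6])
```

Entries smaller in magnitude than σ are set exactly to zero, which is why L1 regularization produces sparse solutions.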

Features

  • 🚀 GPU-Boosted: Experience lightning-fast computations with extensive CUDA support.
  • 🔥 PyTorch Synergy: Naturally integrates with all your PyTorch endeavours.
  • 📚 Expansive Library: From elemental norms (L0, L1, L2, L∞) to advanced regularizations like Total Variation and Fused Lasso.
  • 🤝 User-Friendly: Jump right in! Intuitive design means minimal disruptions to your existing projects.

🛠 Installation

Getting started with ProxTorch is a breeze. Install from PyPI with:

```bash
pip install proxtorch
```

Or install from source with:

```bash
git clone https://github.com/jameschapman19/proxtorch.git
cd proxtorch
pip install -e .
```

🚀 Quick Start

Dive in with this straightforward example:

```python
import torch
from proxtorch.operators import L1

# Define a sample tensor
x = torch.tensor([0.5, -1.2, 0.3, -0.4, 0.7])

# Initialize the L1 proximal operator
l1_prox = L1(sigma=0.1)

# Compute the regularization term value
reg_value = l1_prox(x)
print("Regularization Value:", reg_value)

# Apply the proximal operator
result = l1_prox.prox(x)
print("Prox Result:", result)
```
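The prox call above is the building block of proximal gradient descent: take a gradient step on the smooth part of the objective, then apply the prox of the non-smooth part. A dependency-free sketch for the separable problem min_x ½‖x − b‖² + σ‖x‖₁ (the names `prox_grad` and `soft_threshold` are illustrative, not the ProxTorch API):

```python
# Proximal gradient descent on f(x) + sigma*||x||_1 with smooth part
# f(x) = 0.5 * sum((x - b)^2), solved coordinate-wise in plain Python.
def soft_threshold(v, t):
    # Closed-form prox of t * |.| for a scalar.
    return max(abs(v) - t, 0.0) * (1.0 if v >= 0 else -1.0)

def prox_grad(b, sigma, step=1.0, iters=100):
    x = [0.0] * len(b)
    for _ in range(iters):
        grad = [xi - bi for xi, bi in zip(x, b)]          # gradient of f
        x = [soft_threshold(xi - step * g, step * sigma)  # gradient step + prox
             for xi, g in zip(x, grad)]
    return x

print(prox_grad([0.5, -1.2, 0.3, -0.4, 0.7], sigma=0.1))
```

For this particular quadratic with step size 1 the iteration reaches its fixed point immediately; for a general smooth loss the same gradient-step-then-prox loop is repeated until convergence.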

📜 Diverse Proximal Operators

Regularizers

  • L1, L2 (Ridge), ElasticNet, GroupLasso, TV (includes TV2D, TV3D, TVL12D, TVL13D), Frobenius
  • Norms: TraceNorm, NuclearNorm
  • FusedLasso, Huber

Constraints

  • L0Ball, L1Ball, L2Ball, L∞Ball (Infinity Norm), Frobenius, TraceNorm, Box
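Constraints fit the same framework: the prox of a set's indicator function is the Euclidean projection onto that set. For the L∞ ball the projection is simply elementwise clipping; a hand-rolled sketch (illustrative, not the ProxTorch API):

```python
# Projection onto the L-infinity ball of a given radius: clip each
# coordinate into [-radius, radius]. Constraint classes in proximal
# libraries expose this kind of projection as their prox.
def project_linf_ball(x, radius):
    return [min(max(v, -radius), radius) for v in x]

print(project_linf_ball([0.5, -1.2, 0.3, 1.4], 1.0))  # [0.5, -1.0, 0.3, 1.0]
```

Points already inside the ball are left unchanged, so the projection is idempotent, as any prox of an indicator function must be.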

📖 Documentation

Explore the comprehensive documentation on Read the Docs.

🙌 Credits

ProxTorch stands on the shoulders of giants:

We're thrilled to introduce ProxTorch as an exciting addition to the PyTorch ecosystem. We're confident you'll love it!

🤝 Contribute to the ProxTorch Revolution

Got ideas? Join our vibrant community and make ProxTorch even better!

📜 License

ProxTorch is proudly released under the MIT License.


Owner

  • Name: James Chapman
  • Login: jameschapman19
  • Kind: user
  • Location: London
  • Company: UCL

Studying for a PhD in Machine Learning and Neuroimaging at University College London (UCL)

Citation (CITATION.cff)

cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
- family-names: "Chapman"
  given-names: "James"
  orcid: "https://orcid.org/0000-0002-9364-8118"
title: "ProxTorch"
version: 0.0.8
date-released: 2023-09-02
url: "https://github.com/jameschapman19/proxtorch"

GitHub Events

Total
Last Year

Committers

Last synced: 10 months ago

All Time
  • Total Commits: 131
  • Total Committers: 1
  • Avg Commits per committer: 131.0
  • Development Distribution Score (DDS): 0.0
Past Year
  • Commits: 0
  • Committers: 0
  • Avg Commits per committer: 0.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
jameschapman19 j****9@u****k 131
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 0
  • Total pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Total issue authors: 0
  • Total pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
Pull Request Authors
Top Labels
Issue Labels
Pull Request Labels

Packages

  • Total packages: 1
  • Total downloads:
    • pypi 48 last-month
  • Total dependent packages: 0
  • Total dependent repositories: 0
  • Total versions: 10
  • Total maintainers: 1
pypi.org: proxtorch

ProxTorch is a PyTorch library for proximal operators.

  • Versions: 10
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 48 Last month
Rankings
Dependent packages count: 7.5%
Forks count: 30.3%
Average: 36.7%
Stargazers count: 39.2%
Dependent repos count: 69.6%
Maintainers (1)
Last synced: 6 months ago

Dependencies

.github/workflows/changes.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
  • codecov/codecov-action v3 composite
  • stefanzweifel/git-auto-commit-action v4 composite
.github/workflows/draft-pdf.yml actions
  • actions/checkout v3 composite
  • actions/upload-artifact v1 composite
  • openjournals/openjournals-draft-action master composite
.github/workflows/publish.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
docs/source/requirements.txt pypi
  • matplotlib *
  • pydata_sphinx_theme *
  • pytorch-lightning *
  • scikit-learn *
  • sphinx-autodoc-typehints *
  • sphinx-gallery *
  • sphinx_rtd_theme ==1.2.0
  • torch *
poetry.lock pypi
  • black 23.7.0
  • certifi 2023.7.22
  • charset-normalizer 3.2.0
  • click 8.1.6
  • codecov 2.1.13
  • colorama 0.4.6
  • coverage 7.2.7
  • exceptiongroup 1.1.2
  • filelock 3.12.2
  • flake8 5.0.4
  • idna 3.4
  • iniconfig 2.0.0
  • jinja2 3.1.2
  • markupsafe 2.1.3
  • mccabe 0.7.0
  • mpmath 1.3.0
  • mypy-extensions 1.0.0
  • networkx 3.1
  • packaging 23.1
  • pathspec 0.11.2
  • platformdirs 3.10.0
  • pluggy 1.2.0
  • pycodestyle 2.9.1
  • pyflakes 2.5.0
  • pytest 7.4.0
  • pytest-cov 4.1.0
  • requests 2.31.0
  • sympy 1.12
  • tomli 2.0.1
  • torch 2.0.1
  • torch 2.0.1+cpu
  • typing-extensions 4.7.1
  • urllib3 2.0.4
pyproject.toml pypi
  • black * develop
  • codecov * develop
  • flake8 * develop
  • pytest-cov * develop
  • python >=3.8,<4.0.0
  • torch ^2.0.1 (platforms: darwin, linux, win32)