CompressedSensing.jl

Contains a wide-ranging collection of compressed sensing and feature selection algorithms. Examples include matching pursuit algorithms, forward and backward stepwise regression, sparse Bayesian learning, and basis pursuit.

https://github.com/sebastianament/compressedsensing.jl

Science Score: 28.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
  • .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org, ieee.org
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (10.3%) to scientific vocabulary

Keywords

basis-pursuit compressed-sensing feature-selection julia matching-pursuit sparse-bayesian-learning sparse-linear-systems sparse-regression sparsity stepwise-regression subset-selection
Last synced: 6 months ago

Repository

Contains a wide-ranging collection of compressed sensing and feature selection algorithms. Examples include matching pursuit algorithms, forward and backward stepwise regression, sparse Bayesian learning, and basis pursuit.

Basic Info
  • Host: GitHub
  • Owner: SebastianAment
  • License: MIT
  • Language: Julia
  • Default Branch: main
  • Homepage:
  • Size: 336 KB
Statistics
  • Stars: 30
  • Watchers: 2
  • Forks: 2
  • Open Issues: 1
  • Releases: 0
Topics
basis-pursuit compressed-sensing feature-selection julia matching-pursuit sparse-bayesian-learning sparse-linear-systems sparse-regression sparsity stepwise-regression subset-selection
Created over 5 years ago · Last pushed almost 4 years ago
Metadata Files
Readme License Citation

README.md

CompressedSensing.jl


Contains a wide-ranging collection of compressed sensing and feature selection algorithms. Examples include matching pursuit algorithms, forward and backward stepwise regression, sparse Bayesian learning, and basis pursuit.

Matching Pursuits

The package contains implementations of Matching Pursuit (MP), Orthogonal Matching Pursuit (OMP), and Generalized OMP (GOMP), all three of which take advantage of the efficient updating algorithms contained in UpdatableQRFactorizations.jl to compute the QR factorization of the atoms in the active set.
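To make the greedy structure concrete, here is a minimal textbook OMP sketch in Julia. This is an illustration only: the function name and signature are hypothetical, and the package's own implementation grows a QR factorization of the active atoms incrementally (via UpdatableQRFactorizations.jl) rather than re-solving the least-squares problem from scratch each step, as done below for clarity.

```julia
using LinearAlgebra, Random

# Textbook Orthogonal Matching Pursuit (illustrative sketch, not the
# package's implementation): greedily select the atom most correlated
# with the residual, then refit by least squares on the active set.
function omp(A::AbstractMatrix, y::AbstractVector, k::Integer)
    r = copy(y)                              # current residual
    support = Int[]                          # indices of selected atoms
    x = zeros(size(A, 2))
    for _ in 1:k
        push!(support, argmax(abs.(A' * r)))  # best-matching atom
        xs = A[:, support] \ y                # full least-squares refit
        r = y - A[:, support] * xs
        x .= 0
        x[support] .= xs
    end
    return x, support
end

# Example: recover a 2-sparse vector from noiseless Gaussian measurements.
Random.seed!(0)
A = randn(24, 32)
x0 = zeros(32); x0[3] = 1.5; x0[17] = -2.0
xhat, S = omp(A, A * x0, 2)
```

Because the active set only ever grows, replacing the repeated `\` solve with an incrementally updated QR factorization turns the per-iteration refit into a cheap update, which is the efficiency the package's pursuits exploit.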

Stepwise Regression

  • Forward Regression
  • Backward Regression
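Forward regression differs from OMP in its selection rule: instead of picking the atom most correlated with the residual, it refits with each candidate feature and adds the one that most reduces the residual sum of squares. A hedged sketch (the function name and signature are hypothetical; the package's exported API may differ):

```julia
using LinearAlgebra, Random

# Illustrative forward stepwise regression: at each step, trial-refit with
# every remaining column and keep the one giving the smallest RSS.
function forward_regression(A::AbstractMatrix, y::AbstractVector, k::Integer)
    support = Int[]
    candidates = collect(1:size(A, 2))
    for _ in 1:k
        best_j, best_rss = 0, Inf
        for j in candidates
            trial = [support; j]
            r = y - A[:, trial] * (A[:, trial] \ y)   # refit including j
            rss = dot(r, r)
            if rss < best_rss
                best_j, best_rss = j, rss
            end
        end
        push!(support, best_j)
        filter!(!=(best_j), candidates)
    end
    x = zeros(size(A, 2))
    x[support] = A[:, support] \ y
    return x, support
end

# Example: noiseless recovery of a 2-sparse signal.
Random.seed!(1)
A = randn(20, 10)
x0 = zeros(10); x0[2] = 1.0; x0[7] = -3.0
xhat, S = forward_regression(A, A * x0, 2)
```

Backward regression runs the same idea in reverse: start from the full feature set and repeatedly delete the feature whose removal increases the RSS the least.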

Two-Stage Algorithms

Sparse Bayesian Learning

Basis Pursuit

Basis Pursuit (BP) with reweighting schemes, such as those based on entropy regularization and on the Automatic Relevance Determination (ARD, or SBL) prior.
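Reweighted basis pursuit alternates between solving a weighted ℓ1 problem and updating the per-coordinate weights from the current solution. The sketch below uses ISTA for the inner solve and the generic 1/(|x|+ε) weight update in the spirit of Candès, Wakin, and Boyd; the package's entropy- and ARD-based weight updates differ, and all names here are hypothetical.

```julia
using LinearAlgebra, Random

# Soft-thresholding operator, the proximal map of the ℓ1 norm.
soft(z, t) = sign(z) * max(abs(z) - t, 0.0)

# Generic reweighted-ℓ1 sketch (not the package's implementation):
# inner ISTA loop solves min_x 0.5‖Ax − y‖² + λ‖w ⊙ x‖₁,
# outer loop concentrates the penalty away from the detected support.
function reweighted_bp(A, y; λ = 1e-2, ε = 1e-3, outer = 5, inner = 1000)
    n = size(A, 2)
    w = ones(n)
    x = zeros(n)
    L = opnorm(A)^2                   # Lipschitz constant of the gradient
    for _ in 1:outer
        for _ in 1:inner
            x = soft.(x .+ (A' * (y - A * x)) ./ L, (λ .* w) ./ L)
        end
        w = 1 ./ (abs.(x) .+ ε)       # downweight large (likely true) entries
    end
    return x
end

# Example: a 2-sparse signal from noiseless measurements.
Random.seed!(2)
A = randn(24, 40)
x0 = zeros(40); x0[5] = 2.0; x0[21] = -1.0
xhat = reweighted_bp(A, A * x0)
```

The reweighting step is what recovers signals a plain ℓ1 solve would bias or miss: after the first pass the penalty on the detected support shrinks toward zero, debiasing those coefficients while suppressing the rest.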

Citing this Package

This package was written in the course of a research project on sparsity-promoting algorithms and accompanies the paper Sparse Bayesian Learning via Stepwise Regression. Consider using the following citation when referring to this package in a publication.

```bib
@InProceedings{pmlr-v139-ament21a,
  title = {Sparse Bayesian Learning via Stepwise Regression},
  author = {Ament, Sebastian E. and Gomes, Carla P.},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages = {264--274},
  year = {2021},
  editor = {Meila, Marina and Zhang, Tong},
  volume = {139},
  series = {Proceedings of Machine Learning Research},
  month = {18--24 Jul},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v139/ament21a/ament21a.pdf},
  url = {https://proceedings.mlr.press/v139/ament21a.html},
  abstract = {Sparse Bayesian Learning (SBL) is a powerful framework for attaining sparsity in probabilistic models. Herein, we propose a coordinate ascent algorithm for SBL termed Relevance Matching Pursuit (RMP) and show that, as its noise variance parameter goes to zero, RMP exhibits a surprising connection to Stepwise Regression. Further, we derive novel guarantees for Stepwise Regression algorithms, which also shed light on RMP. Our guarantees for Forward Regression improve on deterministic and probabilistic results for Orthogonal Matching Pursuit with noise. Our analysis of Backward Regression culminates in a bound on the residual of the optimal solution to the subset selection problem that, if satisfied, guarantees the optimality of the result. To our knowledge, this bound is the first that can be computed in polynomial time and depends chiefly on the smallest singular value of the matrix. We report numerical experiments using a variety of feature selection algorithms. Notably, RMP and its limiting variant are both efficient and maintain strong performance with correlated features.}
}
```

Owner

  • Name: Sebastian Ament
  • Login: SebastianAment
  • Kind: user
  • Company: Meta

Research Scientist @ Meta

Citation (CITATION.bib)

@InProceedings{pmlr-v139-ament21a,
  title = 	 {Sparse Bayesian Learning via Stepwise Regression},
  author =       {Ament, Sebastian E. and Gomes, Carla P.},
  booktitle = 	 {Proceedings of the 38th International Conference on Machine Learning},
  pages = 	 {264--274},
  year = 	 {2021},
  editor = 	 {Meila, Marina and Zhang, Tong},
  volume = 	 {139},
  series = 	 {Proceedings of Machine Learning Research},
  month = 	 {18--24 Jul},
  publisher =    {PMLR},
  pdf = 	 {http://proceedings.mlr.press/v139/ament21a/ament21a.pdf},
  url = 	 {https://proceedings.mlr.press/v139/ament21a.html},
  abstract = 	 {Sparse Bayesian Learning (SBL) is a powerful framework for attaining sparsity in probabilistic models. Herein, we propose a coordinate ascent algorithm for SBL termed Relevance Matching Pursuit (RMP) and show that, as its noise variance parameter goes to zero, RMP exhibits a surprising connection to Stepwise Regression. Further, we derive novel guarantees for Stepwise Regression algorithms, which also shed light on RMP. Our guarantees for Forward Regression improve on deterministic and probabilistic results for Orthogonal Matching Pursuit with noise. Our analysis of Backward Regression culminates in a bound on the residual of the optimal solution to the subset selection problem that, if satisfied, guarantees the optimality of the result. To our knowledge, this bound is the first that can be computed in polynomial time and depends chiefly on the smallest singular value of the matrix. We report numerical experiments using a variety of feature selection algorithms. Notably, RMP and its limiting variant are both efficient and maintain strong performance with correlated features.}
}

GitHub Events

Total
  • Watch event: 4
  • Fork event: 1
Last Year
  • Watch event: 4
  • Fork event: 1

Committers

Last synced: 7 months ago

All Time
  • Total Commits: 47
  • Total Committers: 2
  • Avg Commits per committer: 23.5
  • Development Distribution Score (DDS): 0.085
Past Year
  • Commits: 0
  • Committers: 0
  • Avg Commits per committer: 0.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
Sebastian Ament s****t@g****m 43
CompatHelper Julia c****y@j****g 4
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 7 months ago

All Time
  • Total issues: 3
  • Total pull requests: 4
  • Average time to close issues: 5 days
  • Average time to close pull requests: 21 days
  • Total issue authors: 3
  • Total pull request authors: 1
  • Average comments per issue: 2.67
  • Average comments per pull request: 0.0
  • Merged pull requests: 4
  • Bot issues: 0
  • Bot pull requests: 4
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • SebastianAment (1)
  • JuliaTagBot (1)
  • WillPowellUk (1)
Pull Request Authors
  • github-actions[bot] (4)
Top Labels
Issue Labels
Pull Request Labels

Packages

  • Total packages: 1
  • Total downloads: unknown
  • Total dependent packages: 0
  • Total dependent repositories: 0
  • Total versions: 2
juliahub.com: CompressedSensing

Contains a wide-ranging collection of compressed sensing and feature selection algorithms. Examples include matching pursuit algorithms, forward and backward stepwise regression, sparse Bayesian learning, and basis pursuit.

  • Versions: 2
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 9.9%
Stargazers count: 22.7%
Average: 28.0%
Dependent packages count: 38.9%
Forks count: 40.4%
Last synced: 6 months ago