Science Score: 77.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ✓ DOI references: found 3 DOI reference(s) in README
- ✓ Academic publication links: links to acm.org
- ✓ Committers with academic emails: 1 of 4 committers (25.0%) from academic institutions
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (11.4%) to scientific vocabulary
Keywords
Keywords from Contributors
Repository
Multivariate Dictionary Learning Algorithm
Basic Info
Statistics
- Stars: 24
- Watchers: 2
- Forks: 12
- Open Issues: 6
- Releases: 3
Topics
Metadata Files
README.md
MDLA - Multivariate Dictionary Learning Algorithm
Dictionary learning for multivariate datasets
This dictionary learning variant is tailored to multivariate datasets, especially timeseries, where each sample is a matrix and the dataset is seen as a tensor. The classical Dictionary Learning Algorithm (DLA) decomposes an input vector on a dictionary matrix with a sparse coefficient vector, see (a) in the figure below. To handle multivariate data, a first approach called multichannel DLA, see (b) in the figure below, decomposes the input matrix on a dictionary matrix but with a sparse coefficient matrix, assuming that a multivariate sample can be seen as a collection of channels explained by the same dictionary. Nonetheless, multichannel DLA breaks the "spatial" coherence of multivariate samples, discarding the column-wise relationships existing in the samples. Multivariate DLA, see (c) in the figure below, decomposes the input matrix on a tensor dictionary, where each atom is a matrix, with a sparse coefficient vector. In this case, the spatial relationships are directly encoded in the dictionary, as each atom has the same dimensions as an input sample.

(figure from Chevallier et al., 2014)
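To make the three variants concrete, here is a minimal numpy sketch of the shapes involved; all names and dimensions are illustrative and this does not use the package's API:

```python
import numpy as np

rng = np.random.RandomState(42)
n_features, n_dims, n_atoms = 5, 3, 8  # sample rows, sample columns (channels), dictionary size

# (a) classical DLA: a vector sample is a sparse combination of vector atoms
D_vec = rng.randn(n_features, n_atoms)   # dictionary matrix
alpha = np.zeros(n_atoms)
alpha[[1, 4]] = rng.randn(2)             # sparse coefficient vector
x_vec = D_vec @ alpha                    # reconstructed vector sample

# (b) multichannel DLA: the same vector dictionary, but a sparse coefficient
#     matrix, so each column (channel) is explained by the same dictionary
A = np.zeros((n_atoms, n_dims))
A[2, :] = rng.randn(n_dims)
X_chan = D_vec @ A                       # (n_features, n_dims) sample

# (c) multivariate DLA: a tensor dictionary whose atoms are matrices, with a
#     sparse coefficient vector, so column-wise structure lives in the atoms
D_tensor = rng.randn(n_atoms, n_features, n_dims)
coeffs = np.zeros(n_atoms)
coeffs[[0, 5]] = rng.randn(2)
X_mat = np.tensordot(coeffs, D_tensor, axes=1)   # (n_features, n_dims) sample
```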
To handle timeseries, two major modifications are brought to DLA:
- extension to multivariate samples
- a shift-invariant approach

The first point is explained above. To implement the second one, there are two possibilities: either slice the input timeseries into small overlapping samples, or use atoms smaller than the input samples, leading to a decomposition with sparse coefficients and offsets. In the latter case, the decomposition can be seen as a sequence of kernels occurring at different time steps.

(figure from Smith & Lewicki, 2005)
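As a rough illustration of the second option (kernels shorter than the signal, placed at offsets), here is a small numpy sketch; the (coefficient, offset, kernel index) triple format is an assumption made for illustration, not the package's actual code representation:

```python
import numpy as np

rng = np.random.RandomState(0)
n_times, n_dims = 50, 3        # length and channel count of one timeseries sample
k_len, n_kernels = 7, 4        # kernels are shorter than the sample

kernels = rng.randn(n_kernels, k_len, n_dims)   # small multivariate atoms

# a shift-invariant sparse decomposition: each activation is
# (coefficient, time offset, kernel index)
activations = [(1.3, 5, 0), (-0.7, 20, 2), (0.4, 37, 3)]

X_rec = np.zeros((n_times, n_dims))
for coeff, offset, k in activations:
    X_rec[offset:offset + k_len] += coeff * kernels[k]   # kernel occurring at this time step
```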
The proposed implementation is an adaptation of the work of the following authors:
- Q. Barthélemy, A. Larue, A. Mayoue, D. Mercier, and J.I. Mars. Shift & 2D rotation invariant sparse coding for multivariate signal. IEEE Trans. Signal Processing, 60:1597–1611, 2012.
- Q. Barthélemy, A. Larue, and J.I. Mars. Decomposition and dictionary learning for 3D trajectories. Signal Process., 98:423–437, 2014.
- Q. Barthélemy, C. Gouy-Pailler, Y. Isaac, A. Souloumiac, A. Larue, and J.I. Mars. Multivariate temporal dictionary learning for EEG. Journal of Neuroscience Methods, 215:19–28, 2013.
Dependencies
The only dependencies are scikit-learn, matplotlib, numpy and scipy.
No installation is required.
Example
A straightforward example is:
```python
import numpy as np
from numpy.linalg import norm
from mdla import MultivariateDictLearning
from mdla import multivariate_sparse_encode

rng_global = np.random.RandomState(0)
n_samples, n_features, n_dims = 10, 5, 3
X = rng_global.randn(n_samples, n_features, n_dims)

n_kernels = 8
dico = MultivariateDictLearning(n_kernels=n_kernels, max_iter=10).fit(X)
residual, code = multivariate_sparse_encode(X, dico)
print('Objective error for each sample is:')
for i in range(len(residual)):
    print('Sample', i, ':', norm(residual[i], 'fro') + len(code[i]))
```
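Here the printed objective for each sample is the Frobenius norm of its residual plus the length of its sparse code, i.e. the reconstruction error penalized by the number of decomposition elements retained for that sample.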
Bibliography
- Chevallier, S., Barthélemy, Q., & Atif, J. (2014). Subspace metrics for multivariate dictionaries and application to EEG. In IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (pp. 7178-7182).
- Smith, E., & Lewicki, M. S. (2005). Efficient coding of time-relative structure using spikes. Neural Computation, 17(1), 19-45.
- Chevallier, S., Barthélemy, Q., & Atif, J. (2014). On the need for metrics in dictionary learning assessment. In European Signal Processing Conference (EUSIPCO) (pp. 1427-1431).
Owner
- Name: Sylvain Chevallier
- Login: sylvchev
- Kind: user
- Location: Gif-sur-Yvette, France
- Company: LISN, Université Paris-Saclay
- Website: https://sylvchev.github.io/
- Twitter: sylvcheva
- Repositories: 43
- Profile: https://github.com/sylvchev
Brain-computer interface, machine learning, little bit of geometry, teaching, GNU/Linux and open source.
Citation (CITATION.cff)
```yaml
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
- family-names: "Sylvain"
  given-names: "Chevallier"
  orcid: "https://orcid.org/0000-0003-3027-8241"
title: "Multivariate Dictionary Learning Algorithm"
version: 1.0.3
doi: 10.5281/zenodo.8434917
date-released: 2023-10-12
url: "https://github.com/sylvchev/mdla"
```
GitHub Events
Total
- Watch event: 1
Last Year
- Watch event: 1
Committers
Last synced: 9 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Sylvain Chevallier | s****r@u****r | 161 |
| dependabot[bot] | 4****] | 28 |
| Sylvain Chevallier | s****v@g****m | 7 |
| Sylvain Chevallier | s****r@u****r | 2 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 7 months ago
All Time
- Total issues: 3
- Total pull requests: 81
- Average time to close issues: 9 days
- Average time to close pull requests: about 1 month
- Total issue authors: 1
- Total pull request authors: 2
- Average comments per issue: 3.33
- Average comments per pull request: 1.16
- Merged pull requests: 40
- Bot issues: 0
- Bot pull requests: 68
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- chapochn (3)
Pull Request Authors
- dependabot[bot] (70)
- sylvchev (13)
Top Labels
Issue Labels
Pull Request Labels
Packages
- Total packages: 1
- Total downloads: 18 last month (pypi)
- Total dependent packages: 0
- Total dependent repositories: 1
- Total versions: 3
- Total maintainers: 1
pypi.org: mdla
Multivariate Dictionary Learning Algorithm
- Homepage: https://github.com/sylvchev/mdla
- Documentation: http://github.com/sylvchev/mdla
- License: BSD-3-Clause
- Latest release: 1.0.2 (published about 4 years ago)
Rankings
Maintainers (1)
Dependencies
- atomicwrites 1.4.0 develop
- attrs 21.2.0 develop
- backports.entry-points-selectable 1.1.1 develop
- cfgv 3.3.1 develop
- colorama 0.4.4 develop
- coverage 6.3.2 develop
- distlib 0.3.3 develop
- filelock 3.4.0 develop
- identify 2.4.0 develop
- importlib-metadata 4.8.2 develop
- iniconfig 1.1.1 develop
- nodeenv 1.6.0 develop
- platformdirs 2.4.0 develop
- pluggy 1.0.0 develop
- pre-commit 2.17.0 develop
- py 1.11.0 develop
- pytest 7.1.1 develop
- pytest-cov 3.0.0 develop
- pyyaml 6.0 develop
- toml 0.10.2 develop
- typing-extensions 4.0.0 develop
- virtualenv 20.10.0 develop
- zipp 3.6.0 develop
- cvxopt 1.3.0
- cycler 0.11.0
- fonttools 4.28.1
- joblib 1.1.0
- kiwisolver 1.3.2
- matplotlib 3.5.1
- numpy 1.21.1
- packaging 21.3
- pillow 9.0.1
- pyparsing 3.0.6
- python-dateutil 2.8.2
- scikit-learn 1.0.2
- scipy 1.6.1
- setuptools-scm 6.3.2
- six 1.16.0
- threadpoolctl 3.0.0
- tomli 1.2.2
- pre-commit ^2.17.0 develop
- pytest ^7.1.1 develop
- pytest-cov ^3 develop
- cvxopt ^1.2.0
- matplotlib ^3.0
- numpy ^1.19.0
- python ^3.7
- scikit-learn ^1.0
- scipy ^1.5
- cvxopt ==1.2.7
- cycler ==0.11.0
- fonttools ==4.28.1
- joblib ==1.1.0
- kiwisolver ==1.3.2
- matplotlib ==3.5.0
- numpy ==1.21.1
- packaging ==21.3
- pillow ==8.4.0
- pyparsing ==3.0.6
- python-dateutil ==2.8.2
- scikit-learn ==1.0.1
- scipy ==1.6.1
- setuptools-scm ==6.3.2
- six ==1.16.0
- threadpoolctl ==3.0.0
- tomli ==1.2.2
- actions/checkout v2 composite
- actions/setup-python v2 composite
- codecov/codecov-action v1 composite
- pre-commit/action v2.0.0 composite