compression-aware-sfw
Code to reproduce the experiments of "Compression-aware Training of Neural Networks using Frank-Wolfe"
Science Score: 41.0%
This score indicates how likely the project is to be science-related, based on the following indicators:
- ✓ CITATION.cff file: found
- ✓ codemeta.json file: found
- ○ .zenodo.json file
- ○ DOI references
- ✓ Academic publication links: links to arxiv.org
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (7.6%) to scientific vocabulary
Keywords
Repository
Basic Info
Statistics
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 0
Topics
Metadata Files
README.md
Compression-aware Training of Neural Networks using Frank-Wolfe
Authors: Max Zimmer, Christoph Spiegel, Sebastian Pokutta
This repository contains the code to reproduce the experiments from the paper "Compression-aware Training of Neural Networks using Frank-Wolfe" (arXiv:2205.11921). The code is based on PyTorch 1.9 and uses the experiment-tracking platform Weights & Biases.
Structure and Usage
Experiments are started from the following file:
- main.py: Starts experiments using the dictionary format of Weights & Biases.
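The dictionary format referred to above is the configuration-dictionary style used by Weights & Biases. As a hypothetical illustration only (the keys "method" and "parameters" follow the public W&B sweep schema, but the concrete parameter names and values below are assumptions, not the repository's actual configuration):

```python
# Hypothetical W&B-style sweep configuration dictionary.
# Structure follows the public W&B sweep schema; the parameter
# names and values are illustrative, not taken from this repository.
sweep_config = {
    "method": "grid",  # exhaustive grid search over the listed values
    "parameters": {
        "learning_rate": {"values": [0.1, 0.01]},
        "optimizer": {"values": ["SFW", "SGD"]},
    },
}
```

Such a dictionary would typically be passed to the experiment launcher, which then runs one training job per parameter combination.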
The rest of the project is structured as follows:
- strategies: Contains all used sparsification methods.
- runners: Contains classes to control the training and collection of metrics.
- metrics: Contains all metrics as well as FLOP computation methods.
- models: Contains all model architectures used.
- optimizers: Contains reimplementations of SFW (Stochastic Frank-Wolfe), SGD, and Proximal SGD.
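The SFW optimizer listed above is the Stochastic Frank-Wolfe method. As a minimal pure-NumPy sketch (not the repository's PyTorch implementation), a single deterministic Frank-Wolfe step over an L1-ball constraint of radius tau looks like this:

```python
import numpy as np

def l1_ball_lmo(grad, tau):
    """Linear minimization oracle for the L1 ball of radius tau:
    returns the vertex v minimizing <grad, v>, which places
    -tau * sign(g_i) on the coordinate i with the largest |g_i|
    and zeros everywhere else."""
    v = np.zeros_like(grad)
    i = np.argmax(np.abs(grad))
    v[i] = -tau * np.sign(grad[i])
    return v

def frank_wolfe_step(x, grad, tau, lr):
    """One Frank-Wolfe update: move x toward the oracle vertex.
    The new iterate is a convex combination of x and v, so it
    stays inside the L1 ball whenever x starts inside it."""
    v = l1_ball_lmo(grad, tau)
    return (1.0 - lr) * x + lr * v
```

Because every vertex of the L1 ball is 1-sparse, the iterate stays a convex combination of few vertices; constraining training this way is one mechanism by which Frank-Wolfe methods encourage sparse, compression-friendly parameters.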
Citation
If you find the paper or the implementation useful for your own research, please consider citing:
@Article{zimmer2022,
  author        = {Max Zimmer and Christoph Spiegel and Sebastian Pokutta},
  title         = {Compression-aware Training of Neural Networks using Frank-Wolfe},
  year          = {2022},
  archiveprefix = {arXiv},
  eprint        = {2205.11921},
  primaryclass  = {cs.LG},
}
Owner
- Name: IOL Lab
- Login: ZIB-IOL
- Kind: organization
- Location: Germany
- Website: https://iol.zib.de
- Repositories: 27
- Profile: https://github.com/ZIB-IOL
- Bio: Working on optimization and learning at the intersection of mathematics and computer science
Issues and Pull Requests
Last synced: 11 months ago
All Time
- Total issues: 0
- Total pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Total issue authors: 0
- Total pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0