gelureluinterpolation

Leveraging Continuously Differentiable Activation for Learning in Analog and Quantized Noisy Environments

https://github.com/vivswan/gelureluinterpolation

Science Score: 67.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 1 DOI reference(s) in README
  • Academic publication links
    Links to: arxiv.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (7.8%) to scientific vocabulary
Last synced: 6 months ago

Repository

Leveraging Continuously Differentiable Activation for Learning in Analog and Quantized Noisy Environments

Basic Info
Statistics
  • Stars: 1
  • Watchers: 1
  • Forks: 0
  • Open Issues: 0
  • Releases: 0
Created about 2 years ago · Last pushed about 1 year ago
Metadata Files
Readme License Citation

README.md

GeLUReLUInterpolation

This is the official repository for the paper:
Leveraging Continuously Differentiable Activation for Learning in Analog and Quantized Noisy Environments
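As the repository name (GeLUReLUInterpolation) suggests, the core idea involves interpolating between the ReLU and GELU activation functions. The following is a minimal sketch of what such an interpolation might look like; the function name `gelu_relu_interp` and the convex-combination form with parameter `alpha` are assumptions for illustration, not the paper's exact formulation.

```python
import math

def relu(x):
    # Standard rectified linear unit: max(0, x).
    return max(0.0, x)

def gelu(x):
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_relu_interp(x, alpha):
    # Hypothetical convex interpolation: alpha=0 recovers ReLU,
    # alpha=1 recovers GELU; intermediate alpha blends the two,
    # trading ReLU's kink at 0 for GELU's smooth, continuously
    # differentiable profile.
    return (1.0 - alpha) * relu(x) + alpha * gelu(x)
```

A continuously differentiable activation like GELU avoids the non-smooth point that ReLU has at zero, which is the property the paper's title highlights for learning under analog and quantization noise.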

Requirements

The packages required to run the simulations are listed in requirements.txt (see the Dependencies section below).

Run the simulation

Each model has its own entry script under src/; see the Models section below for the script to invoke.

Datasets

  • CIFAR-10 and CIFAR-100 datasets are used in the experiments. The datasets are automatically downloaded by the PyTorch library.

Models

  • ConvNet: Model with 6 convolutional layers and 3 fully connected layers. Run this model with the src/run_conv.py script.
  • ResNet: Run this model with the src/run_resnet.py script.
  • VGG: Run this model with the src/run_vgg.py script.
  • ViT: Run this model with the src/run_vit.py script.
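The "quantized noisy environments" in the paper's title refer to settings where activations or weights are discretized to a limited bit depth and perturbed by noise, as in analog hardware. The sketch below illustrates one common form of such a channel, a uniform quantizer with optional Gaussian noise; the function names, the clipping to [-1, 1], and the noise model are illustrative assumptions, not the paper's exact setup.

```python
import random

def quantize(x, bits):
    # Uniform quantization of x to 2**(bits-1) levels per unit,
    # with values clipped to the range [-1, 1].
    levels = 2 ** (bits - 1)
    return max(-1.0, min(1.0, round(x * levels) / levels))

def noisy_quantized(x, bits, sigma=0.0, rng=random):
    # Hypothetical analog channel: additive Gaussian noise of
    # standard deviation sigma, followed by quantization.
    return quantize(x + rng.gauss(0.0, sigma), bits)
```

During training, the non-differentiable `round` step is typically bypassed with a straight-through estimator; a smooth activation helps because its gradient stays well-behaved around the quantization thresholds.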

Cite

We would appreciate it if you cited the following paper in your publications, should you find this code useful:

```bibtex
@article{shah2024leveraging,
  title={Leveraging Continuously Differentiable Activation Functions for Learning in Quantized Noisy Environments},
  author={Shah, Vivswan and Youngblood, Nathan},
  journal={arXiv preprint arXiv:2402.02593},
  url={http://arxiv.org/abs/2402.02593},
  doi={10.48550/arXiv.2402.02593},
  year={2024}
}
```

Or in textual form:

```text
Shah, Vivswan, and Nathan Youngblood. "Leveraging Continuously Differentiable Activation Functions for Learning in Quantized Noisy Environments." arXiv preprint arXiv:2402.02593 (2024).
```

Owner

  • Name: Vivswan Shah
  • Login: Vivswan
  • Kind: user
  • Company: University of Pittsburgh

PhD Student @ Upitt in Machine Learning and Quantum Computing

Citation (CITATION.cff)

cff-version: 1.2.0
message: If you use this software, please cite both the article from preferred-citation and the software itself.
authors:
  - family-names: Shah
    given-names: Vivswan
  - family-names: Youngblood
    given-names: Nathan
title: Leveraging Continuously Differentiable Activation Functions for Learning in Quantized Noisy Environments
version: 1.0.0
url: http://arxiv.org/abs/2402.02593
doi: 10.48550/arXiv.2402.02593
date-released: '2024-12-09'
preferred-citation:
  authors:
    - family-names: Shah
      given-names: Vivswan
    - family-names: Youngblood
      given-names: Nathan
  title: Leveraging Continuously Differentiable Activation Functions for Learning in Quantized Noisy Environments
  doi: 10.48550/arXiv.2402.02593
  url: http://arxiv.org/abs/2402.02593
  type: article-journal
  year: '2024'
  conference: {}
  publisher: {}

GitHub Events

Total
  • Watch event: 1
  • Public event: 1
  • Push event: 2
Last Year
  • Watch event: 1
  • Public event: 1
  • Push event: 2

Dependencies

requirements.txt pypi
  • analogvnn *
  • einops *
  • graphviz *
  • matplotlib *
  • natsort *
  • numpy *
  • portalocker *
  • scipy *
  • seaborn *
  • tabulate *
  • torchdata *
  • torchinfo *
  • torchviz *
  • tqdm *