afa-augment
Science Score: 54.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ○ DOI references: not found
- ✓ Academic publication links: links to arxiv.org, zenodo.org
- ○ Academic email domains: not found
- ○ Institutional organization owner: not found
- ○ JOSS paper metadata: not found
- ○ Scientific vocabulary similarity: low similarity (10.2%) to scientific vocabulary
Repository
Basic Info
- Host: GitHub
- Owner: nis-research
- License: apache-2.0
- Language: Python
- Default Branch: main
- Size: 2.15 MB
Statistics
- Stars: 18
- Watchers: 1
- Forks: 1
- Open Issues: 2
- Releases: 0
Metadata Files
README.md
Auxiliary Fourier Augmentation
This repository contains the code for the paper "Fourier-basis Functions to Bridge Augmentation Gap: Rethinking Frequency Augmentation in Image Classification" accepted at CVPR 2024.
Introduction
We propose Auxiliary Fourier-basis Augmentation (AFA), a complementary technique targeting augmentation in the frequency domain and filling the robustness gap left by visual augmentations. We demonstrate the utility of augmentation via Fourier-basis additive noise in a straightforward and efficient adversarial setting. Our results show that AFA benefits the robustness of models against common corruptions, OOD generalization, and consistency of performance of models against increasing perturbations, with negligible deficit to the standard performance of models over various benchmarks and resolutions. It can be seamlessly integrated with other augmentation techniques to further boost performance.
For more details see our CVPR 2024 paper: Fourier-basis Functions to Bridge Augmentation Gap: Rethinking Frequency Augmentation in Image Classification
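The core operation described above, adding a sampled Fourier-basis wave to an image as noise, can be sketched as follows. This is a minimal illustration, not the paper's exact implementation; the frequency sampling range and the exponential strength distribution are assumptions.

```python
import numpy as np

def fourier_basis_noise(h, w, freq_u, freq_v, strength, rng=None):
    """Generate a single 2D Fourier-basis wave of the given amplitude."""
    rng = np.random.default_rng() if rng is None else rng
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    phase = rng.uniform(0, 2 * np.pi)  # random phase shift
    wave = np.cos(2 * np.pi * (freq_u * ys / h + freq_v * xs / w) + phase)
    return strength * wave

def augment(image, mean_strength=10.0, min_strength=0.0, rng=None):
    """Add one randomly sampled Fourier-basis wave to a grayscale image."""
    rng = np.random.default_rng() if rng is None else rng
    h, w = image.shape
    # Sample a random frequency pair and a strength of at least min_strength.
    freq_u, freq_v = rng.integers(0, h // 2), rng.integers(0, w // 2)
    strength = max(min_strength, rng.exponential(mean_strength))
    return image + fourier_basis_noise(h, w, freq_u, freq_v, strength, rng)
```

In the adversarial setting of the paper, the frequency and strength would be chosen to maximize the training loss rather than sampled uniformly as here.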
Schema

Contents
This directory includes a reference implementation in PyTorch of the augmentation method used in AFA.
We also include PyTorch re-implementations of AFA on both CIFAR-10/100 and ImageNet, both of which support training and evaluation on CIFAR-10/100-C and ImageNet-C.
Experiment Setups
The following snippet is an example of how to use the ConfigBuilder to create a config object:
```python
experiments = [
    # This creates the experimental setup for training ImageNet using the ResNet50-Dubin model.
    # Training uses the AFA attack, with the mean strength set to 10 and the minimum strength set to 0.
    # It does not use JSD, so it defaults to the ACE loss since an attack is specified.
    # It does not use mixing augmentations like CutMix or MixUp.
    # No other augmentations are used.
    {
        'ds': 'in', 'm': 'rn50dubin', 'use_jsd': False,
        'use_prime': False, 'use_augmix': False, 'in_mix': False, 'use_mix': False,
        'use_fourier': False, 'use_apr': False, 'attack': 'afa', 'min_str': 0., 'mean_str': 10.,
    },
    # This creates the experimental setup for training ImageNet using the CCT model.
    # Training uses the AFA attack, with the mean strength set to 10 and the minimum strength set to 0.
    # It does not use JSD, so it defaults to the ACE loss since an attack is specified.
    # It uses mixing augmentations like CutMix or MixUp.
    # It uses AugMix in addition to the AFA augmentation.
    {
        'ds': 'in', 'm': 'cct_14_7x2_224', 'use_jsd': False,
        'use_prime': False, 'use_augmix': True, 'in_mix': False, 'use_mix': True,
        'use_fourier': False, 'use_apr': False, 'attack': 'afa', 'min_str': 0., 'mean_str': 10.,
    },
]
```
The `experiments` variable is a list of dictionaries; each dictionary represents one experimental setup.
Specify the experiment list in the main.py file and run the file to start the experiments.
Look at config_utils.py for more details on the ConfigBuilder class and experimental setups.
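The way such a setup dictionary typically becomes a full config is by merging it over a set of defaults. The sketch below is a hypothetical stand-in for the real ConfigBuilder, whose actual API is defined in config_utils.py:

```python
def build_config(setup):
    # Hypothetical stand-in for ConfigBuilder: start from defaults,
    # then override with the keys given in the experiment dict.
    defaults = {
        'use_jsd': False, 'use_prime': False, 'use_augmix': False,
        'in_mix': False, 'use_mix': False, 'use_fourier': False,
        'use_apr': False, 'attack': None, 'min_str': 0., 'mean_str': 0.,
    }
    return {**defaults, **setup}

experiments = [
    {'ds': 'in', 'm': 'rn50dubin', 'attack': 'afa', 'min_str': 0., 'mean_str': 10.},
]

for setup in experiments:
    cfg = build_config(setup)
    print(cfg['ds'], cfg['m'], cfg['attack'])
```

This defaults-plus-overrides pattern is why the example setups above only need to spell out the flags that differ between experiments.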
Running the Experiments
First install the requirements using the following command:
```bash
pip install -r requirements.txt
```
Then, construct the config object using the ConfigBuilder class and specify the experiments in the main.py file.
This is shown above.
To run the experiments, use the following command:
```bash
python main.py
```
Requirements
- PyTorch
- Numpy
- Matplotlib
- einops, opt_einsum
- tqdm
- ml-collections
- torchvision
- pytorch_lightning
- wandb
- torchmetrics
- thop
Pretrained Models
The process to load pretrained models for ImageNet can be found here, using the load_weights.py script. Pretrained weights can be downloaded as described below:
For ImageNet, all models have been moved to Zenodo and can be downloaded here. For CIFAR-10, all model weights are available here.
Evaluations
We refer to CorruptionBenchCV for the corruption benchmark tests on ImageNet-C, ImageNet-\bar{C}, ImageNet-3DCC, and ImageNet-P, and to ImageNet-v2 and ImageNet-R for the OOD tests.
The Fourier heatmaps of models are plotted using this repository.
Citation
If you find this repository useful, please consider citing our paper:
@inproceedings{afa,
title={Fourier-basis functions to bridge augmentation gap: Rethinking frequency augmentation in image classification},
author={Vaish, Puru and Wang, Shunxin and Strisciuglio, Nicola},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
pages={17763--17772},
year={2024}
}
License
This repository is released under the Apache 2.0 license. See LICENSE for more details.
Owner
- Name: Nicola Strisciuglio's team @UTwente
- Login: nis-research
- Kind: user
- Location: Netherlands
- Company: University of Twente
- Twitter: nicstrisc
- Repositories: 8
- Profile: https://github.com/nis-research
Robust Computer Vision, Neural Networks and Intelligent Systems research @UTwente (leader: Nicola Strisciuglio)
Citation (CITATION.cff)
cff-version: 1.2.0
message: If you use this software, please cite both the article from preferred-citation and the software itself.
authors:
- family-names: Vaish
given-names: Puru
- family-names: Wang
given-names: Shunxin
- family-names: Strisciuglio
given-names: Nicola
title: 'Fourier-basis functions to bridge augmentation gap: Rethinking frequency augmentation in image classification'
version: 1.0.0
date-released: '2024-07-12'
preferred-citation:
authors:
- family-names: Vaish
given-names: Puru
orcid: "https://orcid.org/0000-0002-5180-5293"
- family-names: Wang
given-names: Shunxin
orcid: "https://orcid.org/0000-0002-6105-5545"
- family-names: Strisciuglio
given-names: Nicola
orcid: "https://orcid.org/0000-0002-7478-3509"
title: 'Fourier-basis functions to bridge augmentation gap: Rethinking frequency augmentation in image classification'
doi: 10.48550/arXiv.2403.01944
type: conference-paper
start: 17763
end: 17772
year: '2024'
collection-title: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition
conference: CVPR 2024
publisher: IEEE/CVF
GitHub Events
Total
- Issues event: 10
- Watch event: 7
- Issue comment event: 7
- Push event: 1
- Fork event: 1
Last Year
- Issues event: 10
- Watch event: 7
- Issue comment event: 7
- Push event: 1
- Fork event: 1
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 4
- Total pull requests: 0
- Average time to close issues: about 6 hours
- Average time to close pull requests: N/A
- Total issue authors: 1
- Total pull request authors: 0
- Average comments per issue: 0.25
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 4
- Pull requests: 0
- Average time to close issues: about 6 hours
- Average time to close pull requests: N/A
- Issue authors: 1
- Pull request authors: 0
- Average comments per issue: 0.25
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- sahanHe (5)
- Carinazhao22 (1)
Pull Request Authors
Top Labels
Issue Labels
Pull Request Labels
Dependencies
- einops *
- matplotlib *
- ml-collections *
- numpy *
- opt_einsum *
- pytorch_lightning *
- thop *
- torch *
- torchmetrics *
- torchvision *
- tqdm *
- wandb *