sgumlp
Pytorch implementation of the SGU-MLP Architecture (mostly) as described in the paper "Spatial Gated Multi-Layer Perceptron for Land Use and Land Cover Mapping".
Science Score: 67.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: Found CITATION.cff file
- ✓ codemeta.json file: Found codemeta.json file
- ✓ .zenodo.json file: Found .zenodo.json file
- ✓ DOI references: Found 8 DOI reference(s) in README
- ✓ Academic publication links: Links to arxiv.org, acm.org, zenodo.org
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: Low similarity (12.0%) to scientific vocabulary
Keywords
Repository
Basic Info
Statistics
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 1
Topics
Metadata Files
README.md
SGU-MLP: Pytorch Implementation of the Spatial Gated Multi-Layer Perceptron
Pytorch implementation of the SGU-MLP Architecture from the paper "Spatial Gated Multi-Layer Perceptron for Land Use and Land Cover Mapping". The implementation follows the original authors' implementation published on GitHub. It differs from the architecture described in the paper in the following aspects:
- Input patches and DWC block outputs are combined using a residual connection.
- Patches are embedded using a convolutional layer (by default, the `embedding_kernel_size` is set to 1 to achieve per-pixel projections).
- Input patches can be overlapping (this is only relevant for data preprocessing, not for general model usage).
Additionally, this implementation makes the initial residual weights configurable and learnable.
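The sketch below illustrates how these pieces could fit together: a per-pixel convolutional embedding, a depthwise convolution (DWC) block, and a learnable residual weight. The class and argument names are hypothetical and the exact placement of the residual is an assumption; see `src/sgu_mlp/models.py` for the actual implementation.

```python
# Hypothetical sketch, not the repository's code: names and the residual
# placement are assumptions chosen to mirror the points listed above.
import torch
import torch.nn as nn


class EmbeddingWithDWCResidual(nn.Module):
    """Per-pixel conv embedding followed by a depthwise conv (DWC) block,
    combined with the embedded input via a learnable residual weight."""

    def __init__(self, in_channels, embed_dim, embedding_kernel_size=1, residual_weight=1.0):
        super().__init__()
        # embedding_kernel_size=1 -> per-pixel projection of each patch
        self.embed = nn.Conv2d(in_channels, embed_dim, kernel_size=embedding_kernel_size, padding="same")
        # depthwise convolution block (groups == channels)
        self.dwc = nn.Sequential(
            nn.Conv2d(embed_dim, embed_dim, kernel_size=3, padding="same", groups=embed_dim),
            nn.BatchNorm2d(embed_dim),
            nn.GELU(),
        )
        # initial residual weight is configurable and learned during training
        self.residual_weight = nn.Parameter(torch.tensor(float(residual_weight)))

    def forward(self, x):
        # x: (num_patches, channels, patch_size, patch_size)
        x = self.embed(x)
        return self.dwc(x) + self.residual_weight * x


patches = torch.randn(16, 3, 11, 11)
print(EmbeddingWithDWCResidual(3, 64)(patches).shape)  # torch.Size([16, 64, 11, 11])
```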
An implementation of MLP-Mixer with optional usage of Spatial Gated Units in the MLP blocks is also included. See `src/sgu_mlp/models.py` for details.
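For reference, a minimal Spatial Gating Unit in the spirit of "Pay Attention to MLPs" (Liu et al. 2021) looks roughly as follows; this shows the general mechanism only and is not the code from this repository:

```python
# Minimal SGU sketch following Liu et al. 2021, not the repository's code.
import torch
import torch.nn as nn


class SpatialGatingUnit(nn.Module):
    def __init__(self, dim: int, seq_len: int):
        super().__init__()
        self.norm = nn.LayerNorm(dim // 2)
        # linear projection across the sequence (token) dimension
        self.spatial_proj = nn.Linear(seq_len, seq_len)
        # initialise as near-identity gating: near-zero weights, ones bias
        nn.init.zeros_(self.spatial_proj.weight)
        nn.init.ones_(self.spatial_proj.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim); split channels into value and gate halves
        u, v = x.chunk(2, dim=-1)
        v = self.norm(v)
        v = self.spatial_proj(v.transpose(1, 2)).transpose(1, 2)
        return u * v  # (batch, seq_len, dim // 2)


tokens = torch.randn(4, 121, 128)  # e.g. 11 * 11 = 121 tokens per patch
print(SpatialGatingUnit(128, 121)(tokens).shape)  # torch.Size([4, 121, 64])
```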
Basic Usage
```python
from sgu_mlp import SGUMLPMixer
import torch

height = width = 64
num_patches = height * width
channels = 3
patch_size = 11

input_dimensions = (patch_size, patch_size, channels)
patches = torch.randn(num_patches, *input_dimensions)

# feature extractor
sgu = SGUMLPMixer(
    input_dimensions=input_dimensions,
    token_features=32,
    mixer_features_channel=64,
    mixer_features_sequence=96,
)
out = sgu(patches)  # (num_patches, patch_size**2, channels)
print(out.shape)

# classifier
num_classes = 8
sgu_clf = SGUMLPMixer(
    input_dimensions=input_dimensions,
    token_features=32,
    mixer_features_channel=64,
    mixer_features_sequence=96,
    num_classes=num_classes,
)
out = sgu_clf(patches)  # (num_patches, num_classes)
print(out.shape)
```
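A hedged sketch of how the classifier variant could be trained is shown below. It reuses the constructor arguments from the snippet above and assumes the classifier head returns raw logits of shape `(num_patches, num_classes)`; everything else is standard PyTorch.

```python
import torch
from sgu_mlp import SGUMLPMixer

# same constructor arguments as in the usage example above
model = SGUMLPMixer(
    input_dimensions=(11, 11, 3),
    token_features=32,
    mixer_features_channel=64,
    mixer_features_sequence=96,
    num_classes=8,
)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
criterion = torch.nn.CrossEntropyLoss()  # assumes the model outputs raw logits

patches = torch.randn(256, 11, 11, 3)  # a batch of (patch_size, patch_size, channels) patches
labels = torch.randint(0, 8, (256,))   # one land-cover class label per patch

for _ in range(10):  # a few toy optimisation steps
    optimizer.zero_grad()
    logits = model(patches)             # (256, num_classes)
    loss = criterion(logits, labels)
    loss.backward()
    optimizer.step()
```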
Installation
If you just want to use the SGU-MLP architecture:

```bash
pip install sgu-mlp
```

or

```bash
pip install git+https://github.com/simulacrum6/sgu-mlp.git
```

If you want to run the experiments as well, add the `[experiments]` extras:

```bash
pip install "sgu-mlp[experiments]"
```

or

```bash
pip install "git+https://github.com/simulacrum6/sgu-mlp.git#egg=sgu-mlp[experiments]"
```
Running Experiments
To run the replication experiment, first download the benchmark datasets. You can download them from gDrive or run:

```bash
python3 -m experiments.run download --out_dir='/path/to/data/dir'
```

To run the replication experiment:

```bash
python3 -m experiments.run experiment replication
```

By default, it is assumed that you run the script from the root of this repository (`--root_dir='data/config'`).

To run a custom experiment:

```bash
python3 -m experiments.run run <experiment_type> <cfg_path>
```

Arguments
- `<experiment_type>`: `"cv"` or `"ood"`.
- `<cfg_path>`: path to the config file for the experiment.

See `data/config` for examples of the config format.
References
- Spatial Gated Multi-Layer Perceptron for Land Use and Land Cover Mapping - Jamali et al. 2024
- Pay Attention to MLPs - Liu et al. 2021
- MLP-Mixer: An all-MLP Architecture for Vision - Tolstikhin et al. 2021
Citations
When using this software for your research, please cite the original article as well as this version of the software.
```bibtex
@article{10399888,
  author={Jamali, Ali and Roy, Swalpa Kumar and Hong, Danfeng and Atkinson, Peter M. and Ghamisi, Pedram},
  journal={IEEE Geoscience and Remote Sensing Letters},
  title={Spatial-Gated Multilayer Perceptron for Land Use and Land Cover Mapping},
  year={2024},
  volume={21},
  number={},
  pages={1-5},
  keywords={Feature extraction;Classification algorithms;Hyperspectral imaging;Data models;Transformers;Biological system modeling;Training data;Attention mechanism;image classification;spatial gating unit (SGU);vision transformers (ViTs)},
  doi={10.1109/LGRS.2024.3354175}
}
```
```bibtex
@software{hamacher2024sgumlp,
  title = {SGU-MLP: Pytorch Implementation of the Spatial Gated Multi-Layer Perceptron},
  author = {Hamacher, Marius},
  year = {2025},
  url = {https://github.com/simulacrum6/sgu-mlp},
  version = {0.1.1},
  note = {Pytorch Implementation of the SGU-MLP Architecture from the paper "Spatial Gated Multi-Layer Perceptron for Land Use and Land Cover Mapping"},
  doi = {10.5281/zenodo.15227847}
}
```
Owner
- Name: Marius Hamacher
- Login: simulacrum6
- Kind: user
- Location: Duisburg, Germany
- Company: FernUniversität in Hagen
- Repositories: 2
- Profile: https://github.com/simulacrum6
Researcher at CATALPA, LG Computational Linguistics, FernUniversität in Hagen.
Citation (CITATION.cff)
```yaml
cff-version: 1.2.0
type: software
message: "If you use this software in your research, please cite it as below. Please also cite the original publication."
authors:
  - family-names: "Hamacher"
    given-names: "Marius"
title: "SGU-MLP: Pytorch Implementation of the Spatial Gated Multi-Layer Perceptron"
version: 0.1.1
date-released: 2025-04-16
url: "https://github.com/simulacrum6/sgumlp"
doi: https://doi.org/10.5281/zenodo.15227846
```
GitHub Events
Total
- Release event: 1
- Public event: 1
- Push event: 3
- Create event: 1
Last Year
- Release event: 1
- Public event: 1
- Push event: 3
- Create event: 1
Issues and Pull Requests
Last synced: 10 months ago
All Time
- Total issues: 0
- Total pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Total issue authors: 0
- Total pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
Pull Request Authors
Top Labels
Issue Labels
Pull Request Labels
Dependencies
- 126 dependencies
- black ^24.10.0 develop
- hypothesis ^6.123.7 develop
- pytest ^8.3.4 develop
- gdown ^5.2.0 experiments
- lightning ^2.5.0.post0 experiments
- mlflow ^2.20.0 experiments
- pandas ^2.2.3 experiments
- patool ^3.1.0 experiments
- pillow ^11.1.0 experiments
- rasterio ^1.4.3 experiments
- scikit-learn ^1.6.0 experiments
- scipy ^1.15.0 experiments
- seaborn ^0.13.2 experiments
- torchmetrics ^1.6.1 experiments
- torchvision ^0.20.1 experiments
- numpy ^2.2.1
- python ^3.10
- torch ^2.5.1