https://github.com/ammarlodhi255/fine-grained-approach-to-wrist-pathology-recognition
This repository contains the official code for the paper "Learning from the Few: Fine-grained Approach to Wrist Pathology Recognition on a Limited Dataset".
Science Score: 36.0%
This score indicates how likely this project is to be science-related, based on the following indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file (found codemeta.json)
- ○ .zenodo.json file
- ✓ DOI references (found 2 DOI references in the README)
- ✓ Academic publication links (links to sciencedirect.com, nature.com)
- ○ Committers with academic emails
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity (low similarity, 7.7%)
Basic Info
- Host: GitHub
- Owner: ammarlodhi255
- License: MIT
- Language: Python
- Default Branch: main
- Homepage: https://www.sciencedirect.com/science/article/pii/S0010482524011296
- Size: 4.68 MB
Statistics
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 0
Metadata Files
README.md
Learning from the Few: Fine-grained Approach to Pediatric Wrist Pathology Recognition on a Limited Dataset
Paper URL: https://www.sciencedirect.com/science/article/pii/S0010482524011296
We approach wrist pathology recognition as a fine-grained visual recognition (FGVR) problem. We refine our fine-grained architecture through ablation analysis and the integration of the LION optimizer. Leveraging Grad-CAM, an explainable-AI technique, we highlight the discriminative regions the network attends to. Despite using limited data, reflective of real-world medical study constraints, our method consistently outperforms state-of-the-art image recognition models on both the augmented and the original (more challenging) test sets. Our refined architecture improves accuracy by 1.06% and 1.25% over the baseline method, resulting in accuracies of 86% and 84%, respectively. Moreover, our approach demonstrates the highest fracture sensitivity of 97%, highlighting its potential to enhance wrist pathology recognition.
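The LION optimizer mentioned above replaces Adam-style adaptive scaling with a sign-based update. As a rough illustration (not the repository's implementation, which would use a PyTorch optimizer class), the update rule can be sketched in plain Python:

```python
def lion_step(w, g, m, lr=1e-4, beta1=0.9, beta2=0.99, wd=0.0):
    """One Lion update over parameter list w with gradients g and momentum m.

    Lion takes the *sign* of an interpolated momentum as the step direction:
        u  = sign(beta1 * m + (1 - beta1) * g)
        w <- w - lr * (u + wd * w)          # decoupled weight decay
        m <- beta2 * m + (1 - beta2) * g    # slow momentum EMA
    """
    new_w, new_m = [], []
    for wi, gi, mi in zip(w, g, m):
        u = beta1 * mi + (1 - beta1) * gi
        direction = (u > 0) - (u < 0)       # sign(u), in {-1, 0, 1}
        new_w.append(wi - lr * (direction + wd * wi))
        new_m.append(beta2 * mi + (1 - beta2) * gi)
    return new_w, new_m

# One step from w=1.0 with gradient 2.0 and zero momentum:
# u = 0.1 * 2.0 = 0.2 > 0, so w moves down by exactly lr.
w, m = lion_step([1.0], [2.0], [0.0], lr=0.01)  # w == [0.99], m == [0.02]
```

Because every parameter moves by exactly ±lr regardless of gradient magnitude, Lion typically needs a smaller learning rate than AdamW.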

Evaluation Against Other Deep Neural Networks
| Model | Test Accuracy (%) |
|----------------|-------------------|
| EfficientNetV2 | 53.59 |
| NFNet | 65.40 |
| VGG16 | 65.82 |
| ViT | 70.25 |
| DeiT3 | 70.89 |
| RegNet | 72.36 |
| DenseNet201 | 73.42 |
| MobileNetV2 | 76.37 |
| CMAL | 76.58 |
| RexNet100 | 77.43 |
| ResNet101 | 77.43 |
| IELT | 78.10 |
| DenseNet121 | 78.21 |
| ResNest101e | 78.27 |
| InceptionV4 | 78.69 |
| MetaFormer | 78.90 |
| ResNet50 | 79.11 |
| InceptionV3 | 79.54 |
| EfficientNet_b0 | 79.96 |
| YOLOv8x | 80.50 |
| HERBS | 82.70 |
| Our Approach (PIM for FGVR) | 84.38 |
LION Integration and FPN Adjustment
| Model | Test Set 1 Accuracy (%) | Test Set 2 Accuracy (%) |
|----------------|-------------------------|-------------------------|
| PIM | 84.38 | 82.50 |
| PIM + LION | 85.44 | 83.75 |
1. Environment setting
1.0. Package
- Install Requirements
pip install -r requirements.txt
1.1. Dataset
In this paper, we use a dataset curated from GRAZPEDWRI-DX. The curated dataset can be found at:
- Curated Dataset
1.2. Our Pre-trained Model
The weights of the refined FGVR model can be found at:
- Weights
1.3. OS
- [x] Windows10
- [x] Ubuntu20.04
- [x] macOS (CPU only)
2. Train
- [x] Single GPU Training
- [x] DataParallel (single machine multi-gpus)
- [ ] DistributedDataParallel
(more information: https://pytorch.org/tutorials/intermediate/ddp_tutorial.html)
2.1. Data
train, val, test, and test2 directory structure:
├── train/
│   ├── 0/
│   │   ├── 0133_0306769778_07_WRI-R2_M015-1.png
│   │   ├── 0133_0306769778_07_WRI-R2_M015-3.png
│   │   └── ...
│   ├── 1/
│   │   ├── 0025_0483842914_01_WRI-L2_F000.png
│   │   ├── 0053_1119833109_03_WRI-R1_F005.png
│   │   └── ...
│   └── ...
├── val/
│   ├── 0/
│   │   ├── 0133_0306769778_07_WRI-R2_M015-0.png
│   │   ├── 0133_0306769778_07_WRI-R2_M015-2.png
│   │   └── ...
│   ├── 1/
│   │   ├── 0042_0827512771_04_WRI-R2_M015.png
│   │   ├── 0071_0680563744_02_WRI-R1_F009.png
│   │   └── ...
│   └── ...
├── test/
│   ├── 0/
│   │   ├── 0772_0547017117_03_WRI-R1_M017-0.png
│   │   ├── 0772_0547017117_03_WRI-R1_M017-1.png
│   │   └── ...
│   ├── 1/
│   │   ├── 0069_0502540283_01_WRI-L1_M013.png
│   │   ├── 0078_1212376595_01_WRI-L1_M011.png
│   │   └── ...
│   └── ...
└── test2/
    ├── 0/
    │   ├── 0772_0547017117_03_WRI-R1_M017.png
    │   ├── 0834_0240036198_01_WRI-R1_M014.png
    │   └── ...
    ├── 1/
    │   ├── 0069_0502540283_01_WRI-L1_M013.png
    │   ├── 0115_0432451427_01_WRI-L2_M004.png
    │   └── ...
    └── ...
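As a quick sanity check before training, the layout above can be validated with a short stdlib script. This is an illustrative helper, not part of the repository; the numeric class folder names and .png extension follow the tree above:

```python
from pathlib import Path

def check_layout(root, splits=("train", "val", "test", "test2")):
    """Return {split: {class_dir: png_count}} for a dataset laid out as
    root/<split>/<class>/<image>.png; raise if a split folder is missing."""
    summary = {}
    for split in splits:
        split_dir = Path(root) / split
        if not split_dir.is_dir():
            raise FileNotFoundError(f"missing split directory: {split_dir}")
        summary[split] = {
            cls.name: sum(1 for _ in cls.glob("*.png"))
            for cls in sorted(split_dir.iterdir())
            if cls.is_dir()
        }
    return summary
```

Running `check_layout("data")` should report every split with at least one image per class folder before you start training.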
2.2. Configuration
All important configuration files are located in the configs folder. You can modify the yaml files directly if needed.
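A trimmed sketch of what such a yaml might contain is shown below; the key names and values here are illustrative assumptions only, so consult configs/wrist.yaml for the actual schema:

```yaml
# Illustrative only -- the real keys live in configs/wrist.yaml.
project_name: wrist_fgvr      # outputs go to ./records/{projectname}/{expname}/
exp_name: baseline
train_root: ./data/train      # folders laid out as in section 2.1
val_root: ./data/val
batch_size: 8
max_epochs: 100
```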
2.3. Run
python main.py --c ./configs/wrist.yaml
The trained model will be saved in ./records/{projectname}/{expname}/backup/
2.4. Multi-GPUs
For multi-GPU training on a single machine, enable the DataParallel wrapper at line 66 of main.py:
model = torch.nn.DataParallel(model, device_ids=None)
3. Evaluation
If you want to evaluate our pre-trained model or your custom-trained model (specify the path to your model in the config file):
python main.py --c ./configs/eval.yaml
Results will be saved in ./records/{projectname}/{expname}/eval_results.txt
4. Heatmap Generation
python heat.py --c ./configs/wrist.yaml --img data/test/1/0069_0502540283_01_WRI-L1_M013.png --save_img ./0069_0502540283_01_WRI-L1_M013.png
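heat.py handles hooking into the network; the core Grad-CAM arithmetic it relies on is small enough to sketch in NumPy. Given a conv layer's activations and the gradients of the class score with respect to them (both captured during a forward/backward pass), the heatmap is a ReLU of a gradient-weighted channel sum. The function below is an illustration of that step, not the repository's code:

```python
import numpy as np

def grad_cam_map(activations, gradients):
    """Combine a conv layer's activations (C, H, W) and the gradients of the
    class score w.r.t. those activations (C, H, W) into a Grad-CAM heatmap (H, W)."""
    weights = gradients.mean(axis=(1, 2))  # alpha_k: global-average-pooled gradients
    cam = (weights[:, None, None] * activations).sum(axis=0)  # weighted channel sum
    cam = np.maximum(cam, 0.0)             # ReLU: keep only positive evidence
    if cam.max() > 0:
        cam = cam / cam.max()              # normalize to [0, 1] for overlay
    return cam
```

The resulting (H, W) map is then upsampled to the input resolution and overlaid on the radiograph.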

5. Inference
If you want to run inference on your own images and obtain a confusion matrix, specify the paths in configs/eval.yaml and run:
python infer.py --c ./configs/eval.yaml
Results will be saved in ./records/{projectname}/{expname}/infer_results.txt
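For reference, the confusion-matrix bookkeeping that infer.py reports can be sketched in a few lines of plain Python; the fracture-class index used below is an assumption for illustration (check your own class-to-folder mapping):

```python
def confusion_matrix(y_true, y_pred, num_classes):
    """mat[t][p] counts samples whose true class is t and predicted class is p."""
    mat = [[0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        mat[t][p] += 1
    return mat

def sensitivity(mat, cls):
    """Recall for one class: TP / (TP + FN), computed from that class's row."""
    row = mat[cls]
    return row[cls] / sum(row)

# Toy run with 4 samples, fracture assumed (hypothetically) to be class 1:
mat = confusion_matrix([0, 1, 1, 1], [0, 1, 0, 1], num_classes=2)
# mat == [[1, 0], [1, 2]]; fracture sensitivity == 2/3
```

The fracture sensitivity reported in the paper is this per-class recall computed on the fracture class.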
Citation
If you find our paper useful in your research, please consider citing:
@article{AHMED2024Learning,
  title   = {Learning from the few: Fine-grained approach to pediatric wrist pathology recognition on a limited dataset},
  author  = {Ammar Ahmed and Ali Shariq Imran and Zenun Kastrati and Sher Muhammad Daudpota and Mohib Ullah and Waheed Noor},
  journal = {Computers in Biology and Medicine},
  volume  = {181},
  pages   = {109044},
  year    = {2024},
  issn    = {0010-4825},
  doi     = {10.1016/j.compbiomed.2024.109044},
  url     = {https://www.sciencedirect.com/science/article/pii/S0010482524011296},
}
Acknowledgment
Thanks to PIM for their base PyTorch implementation of the Plug-in Module.
This work was supported in part by the Department of Computer Science (IDI), Faculty of Information Technology and Electrical Engineering, Norwegian University of Science and Technology (NTNU), Gjøvik, Norway; and in part by the Curricula Development and Capacity Building in Applied Computer Science for Pakistani Higher Education Institutions (CONNECT), Project number: NORPART-2021/10502, funded by DIKU.
Owner
- Name: Ammar Ahmed
- Login: ammarlodhi255
- Kind: user
- Location: Sukkur, Pakistan
- Website: https://www.youtube.com/channel/UCAh8QVO85NLQGj_RhYoTU1w/videos
- Repositories: 9
- Profile: https://github.com/ammarlodhi255
A computer scientist at heart, interested in AI, software development, and space.
Committers
Last synced: 7 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| ammarlodhi255 | a****8@g****m | 22 |
Issues and Pull Requests
Last synced: 7 months ago
All Time
- Total issues: 0
- Total pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Total issue authors: 0
- Total pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Dependencies
- matplotlib ==3.3.1
- numpy ==1.20.2
- opencv-python ==4.5.2.54
- pillow ==8.2.0
- pip ==21.1.3
- seaborn ==0.11.0
- timm ==0.5.4
- torch ==1.9.0
- torchvision ==0.10.0
- wandb ==0.12.4
- _libgcc_mutex 0.1
- _openmp_mutex 4.5
- blas 1.0
- bzip2 1.0.8
- ca-certificates 2021.10.8
- certifi 2021.10.8
- cudatoolkit 11.3.1
- ffmpeg 4.3
- freetype 2.11.0
- giflib 5.2.1
- gmp 6.2.1
- gnutls 3.6.15
- intel-openmp 2021.4.0
- jpeg 9d
- lame 3.100
- lcms2 2.12
- ld_impl_linux-64 2.35.1
- libffi 3.3
- libgcc-ng 9.3.0
- libgfortran-ng 7.5.0
- libgfortran4 7.5.0
- libgomp 9.3.0
- libiconv 1.15
- libidn2 2.3.2
- libpng 1.6.37
- libstdcxx-ng 9.3.0
- libtasn1 4.16.0
- libtiff 4.2.0
- libunistring 0.9.10
- libuv 1.40.0
- libwebp 1.2.0
- libwebp-base 1.2.0
- lz4-c 1.9.3
- mkl 2021.4.0
- mkl-service 2.4.0
- mkl_fft 1.3.1
- mkl_random 1.2.2
- ncurses 6.3
- nettle 3.7.3
- numpy 1.21.2
- numpy-base 1.21.2
- olefile 0.46
- openh264 2.1.0
- openssl 1.1.1l
- patsy 0.5.2
- pillow 8.4.0
- pip 21.2.4
- python 3.8.12
- python-dateutil 2.8.2
- python_abi 3.8
- pytorch 1.10.0
- pytorch-mutex 1.0
- pytz 2021.3
- readline 8.1
- setuptools 58.0.4
- six 1.16.0
- sqlite 3.36.0
- statsmodels 0.12.1
- tk 8.6.11
- torchaudio 0.10.0
- torchvision 0.11.1
- typing_extensions 3.10.0.2
- wheel 0.37.0
- xz 5.2.5
- zlib 1.2.11
- zstd 1.4.9