explainable-tunnel-rock-classification
Official code and dataset for TUST-D-24-02672
https://github.com/john-wang-0809/explainable-tunnel-rock-classification
Science Score: 44.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ○ DOI references: not found
- ○ Academic publication links: not found
- ○ Academic email domains: not found
- ○ Institutional organization owner: not found
- ○ JOSS paper metadata: not found
- ○ Scientific vocabulary similarity: low similarity (13.1%) to scientific vocabulary
Repository
Basic Info
Statistics
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
- Releases: 0
Metadata Files
README.md
Tunnelling and Underground Space Technology + Supplementary Code
Project Introduction
Based on a self-designed backbone network, this project introduces the following innovative improvements for the rock image classification task:
- Introduction of Soft Token Shift Block (self-developed module) to enhance feature representation capability.
- Integration of Cross-Entropy Loss and Label Smoothing Loss to improve model generalization.
- Multi-layer attention heatmap visualization, supporting analysis of attention distribution at each layer of the model.
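The label-smoothing half of that combined loss can be illustrated with a short, self-contained sketch. The smoothing factor `eps = 0.1` and the four-class example are illustrative assumptions, not values taken from the paper:

```python
import math

def smooth_labels(target, num_classes, eps=0.1):
    """Label smoothing: take eps of the probability mass from the
    hard label and spread it uniformly over all classes."""
    dist = [eps / num_classes] * num_classes
    dist[target] += 1.0 - eps
    return dist

def cross_entropy(probs, target_dist):
    """Cross-entropy between predicted probabilities and a target distribution."""
    return -sum(t * math.log(p) for p, t in zip(probs, target_dist) if t > 0)

dist = smooth_labels(2, 4)  # hard label "class 2" of 4 -> roughly [0.025, 0.025, 0.925, 0.025]
loss = cross_entropy([0.25, 0.25, 0.25, 0.25], dist)
```

Because the smoothed target never puts full probability on one class, the model is discouraged from producing overconfident logits, which is the generalization benefit the README refers to.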
Environment Requirements
- Python >= 3.8
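A possible environment setup is sketched below; the package set is inferred from the dependency list later in this page (Ultralytics-style requirements), so prefer the repository's own requirements file if one exists:

```shell
# Create and activate a virtual environment (Python >= 3.8), then
# install the core packages this project appears to depend on.
python3 -m venv .venv
source .venv/bin/activate
pip install "torch>=1.8.0" "torchvision>=0.9.0" "numpy>=1.23.5,<2.0.0" \
            opencv-python pyyaml tqdm matplotlib
```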
Dataset Preparation
- Trial link for the dataset used in the paper:
- Place the dataset in the `datasets/` directory, or specify the path in the configuration file.
- If you need to use your own dataset, you can follow the format of the trial dataset provided in this project.
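Ultralytics-style classifiers expect an ImageNet-style folder layout: one directory per split, each containing one subfolder per class. The sketch below builds such a skeleton in a temporary directory; the class names are placeholders, not the paper's actual rock grades:

```python
import os
import tempfile

# root/<split>/<class>/*.jpg is the layout Ultralytics classification
# training reads; mirror the trial dataset's actual class names here.
root = os.path.join(tempfile.mkdtemp(), "Rock")
for split in ("train", "val", "test"):
    for rock_class in ("grade_I", "grade_II", "grade_III"):  # placeholder names
        os.makedirs(os.path.join(root, split, rock_class), exist_ok=True)
```

Point the `data=` argument of `model.train(...)` at this root directory (or at the image folder, as shown in the training example below).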
Usage
- Train the Model
```bash
python main.py
```
To further modify parameters, you can specify them in main.py:
```python
from ultralytics import YOLO  # the project builds on Ultralytics YOLO

model = YOLO('ultralytics/cfg/models/v8/yolov8-twoCSP-CTF-CFE.yaml')  # model config used elsewhere in this repo
model.train(data="Datasets/data/Rock/image", batch=32,
            epochs=2, project='runs/class/train=2', name='ROCKclass',
            amp=False,
            workers=1,
            optimizer='AdamW',  # optimizer
            # cos_lr=True,      # cosine LR scheduler
            # lr0=0.0005,
            lr0=0.001,
            # imgsz=480,
            )  # train
```
- Testing/Evaluation
- After training, you can perform inference and evaluation with the following code:
```python
from ultralytics import YOLO

model = YOLO(r"runs/class/train=2/ROCKclass3/weights/best.pt")
model.val(data=r"ultralytics/cfg/datasets/mydata.yaml", ch=3, batch=4,
          workers=1, save_json=True, save_txt=True)  # validation
model.predict(source=r"Datasets/data/Rock/test", save=True)  # detection
```
- Visualization Analysis
```bash
python heatmap_cls.py              # single-layer use
python heatmap_cls_multi_layer.py  # multi-layer use
```
- You can scroll to the end of the above files to select the corresponding method and path:
```python
if __name__ == '__main__':
    model = yolov8_heatmap(**get_params())
    model(r'Datasets/data/Rock/image/grad/', 'Xresult')
```
- In addition, you can also select the method and parameters in yolov8_heatmap.py:

```python
def get_params():  # line 162
    params = {
        'weight': r'runs/detect/train=2/FINALCHECK/weights/best.pt',
        'cfg': r'ultralytics/cfg/models/v8/yolov8-twoCSP-CTF-CFE.yaml',
        'device': 'cuda:0',
        'method': 'GradCAM',       # GradCAMPlusPlus, GradCAM, XGradCAM
        'layer': 'model.model[9]',
        'backward_type': 'all',    # class, box, all
        'conf_threshold': 0.6,     # 0.6
        'ratio': 0.02              # 0.02-0.1
    }
    return params
```
- After running the above code, the model will automatically generate various visualization results. Standard outputs:
  - `output.jpg`: standard GradCAM multi-layer visualization
  - `output++.jpg`: improved GradCAM++ multi-layer visualization
  - `xoutput.jpg`: XGradCAM multi-layer visualization
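The three heatmap variants differ mainly in how per-channel weights are derived from the gradients. Plain Grad-CAM, sketched below with NumPy, global-average-pools the gradients at the chosen layer and takes a ReLU of the weighted activation sum; the shapes and random inputs are purely illustrative, not tied to this model:

```python
import numpy as np

def grad_cam(activations, gradients):
    """Plain Grad-CAM on one feature map.

    activations, gradients: arrays of shape (C, H, W) captured at the
    target layer (e.g. the 'layer' entry in get_params() above).
    Returns an (H, W) heatmap scaled to [0, 1]."""
    weights = gradients.mean(axis=(1, 2))  # (C,) global average pool of grads
    cam = np.maximum((weights[:, None, None] * activations).sum(axis=0), 0)
    if cam.max() > 0:
        cam /= cam.max()  # normalise for overlaying on the input image
    return cam

rng = np.random.default_rng(0)
acts, grads = rng.random((16, 8, 8)), rng.random((16, 8, 8))
heatmap = grad_cam(acts, grads)
```

GradCAM++ and XGradCAM replace the plain gradient average with higher-order or activation-scaled weightings, which is why their multi-layer heatmaps can differ noticeably on the same image.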
Acknowledgements
- Contributors to related open-source projects and datasets
Owner
- Login: John-Wang-0809
- Kind: user
- Repositories: 1
- Profile: https://github.com/John-Wang-0809
Citation (CITATION.cff)
# This CITATION.cff file was generated with https://bit.ly/cffinit
cff-version: 1.2.0
title: Ultralytics YOLO
message: >-
  If you use this software, please cite it using the
  metadata from this file.
type: software
authors:
  - given-names: Glenn
    family-names: Jocher
    affiliation: Ultralytics
    orcid: 'https://orcid.org/0000-0001-5950-6979'
  - given-names: Ayush
    family-names: Chaurasia
    affiliation: Ultralytics
    orcid: 'https://orcid.org/0000-0002-7603-6750'
  - family-names: Qiu
    given-names: Jing
    affiliation: Ultralytics
    orcid: 'https://orcid.org/0000-0003-3783-7069'
repository-code: 'https://github.com/ultralytics/ultralytics'
url: 'https://ultralytics.com'
license: AGPL-3.0
version: 8.0.0
date-released: '2023-01-10'
GitHub Events
Total
- Issue comment event: 3
- Push event: 4
- Pull request event: 2
- Create event: 5
Last Year
- Issue comment event: 3
- Push event: 4
- Pull request event: 2
- Create event: 5
Dependencies
- actions/checkout v4 composite
- actions/setup-python v5 composite
- codecov/codecov-action v4 composite
- conda-incubator/setup-miniconda v3 composite
- slackapi/slack-github-action v1.26.0 composite
- contributor-assistant/github-action v2.4.0 composite
- actions/checkout v4 composite
- github/codeql-action/analyze v3 composite
- github/codeql-action/init v3 composite
- actions/checkout v4 composite
- docker/login-action v3 composite
- docker/setup-buildx-action v3 composite
- docker/setup-qemu-action v3 composite
- nick-invision/retry v3 composite
- slackapi/slack-github-action v1.26.0 composite
- ultralytics/actions main composite
- actions/first-interaction v1 composite
- actions/checkout v4 composite
- nick-invision/retry v3 composite
- actions/checkout v4 composite
- actions/setup-python v5 composite
- actions/checkout v4 composite
- actions/setup-python v5 composite
- slackapi/slack-github-action v1.26.0 composite
- actions/stale v9 composite
- pytorch/pytorch 2.3.1-cuda12.1-cudnn8-runtime build
- matplotlib >=3.3.0
- numpy >=1.23.5,<2.0.0
- opencv-python >=4.6.0
- pandas >=1.1.4
- pillow >=7.1.2
- psutil *
- py-cpuinfo *
- pyyaml >=5.3.1
- requests >=2.23.0
- scipy >=1.4.1
- seaborn >=0.11.0
- torch >=1.8.0
- torchvision >=0.9.0
- tqdm >=4.64.0
- ultralytics-thop >=2.0.0