bioexp

Explainability of Deep Learning Models

https://github.com/koriavinash1/bioexp

Science Score: 10.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
  • .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (15.1%) to scientific vocabulary

Keywords

ablation activation-maximization causal-inference causality clustering concept-extraction dissection explainability explainable-ai gradcam interpretability interpretable-ai intervention joint-analysis rct uncertainty
Last synced: 6 months ago

Repository

Explainability of Deep Learning Models

Basic Info
  • Host: GitHub
  • Owner: koriavinash1
  • License: mit
  • Language: Python
  • Default Branch: master
  • Homepage:
  • Size: 542 MB
Statistics
  • Stars: 30
  • Watchers: 5
  • Forks: 5
  • Open Issues: 6
  • Releases: 1
Topics
ablation activation-maximization causal-inference causality clustering concept-extraction dissection explainability explainable-ai gradcam interpretability interpretable-ai intervention joint-analysis rct uncertainty
Created almost 6 years ago · Last pushed almost 3 years ago
Metadata Files
Readme Funding License Code of conduct

README.md

BioExp

(badges: Build Status · Documentation Status · PyPI version · Downloads · arXiv · License: MIT)

A toolbox for explaining deep learning models that perform image processing tasks on medical and natural images.

Features

  • [x] Dissection Analysis
  • [x] Ablation Analysis
  • [x] Uncertainty Analysis
    • [x] Epistemic Uncertainty using Bayesian Dropout
    • [x] Aleatoric Uncertainty using Test Time Augmentation
  • [x] Activation Maximization
  • [x] CAM Analysis
  • [x] RCT on input and concept space
  • [x] Concept generation clustering analysis
    • [x] Weight-based clustering
    • [x] Feature-based clustering
  • [x] Concept Identification
    • [x] Dissection based
    • [x] Flow based
  • [x] Causal Graph
  • [x] Inference Methods
  • [ ] Counterfactuals on Visual Trails
  • [ ] Counterfactual Generation
  • [ ] Ante-hoc methods (Meta-Causation)

Citations

If you use BioExp, please cite the following papers:

```
@article{kori2020abstracting,
  title={Abstracting Deep Neural Networks into Concept Graphs for Concept Level Interpretability},
  author={Kori, Avinash and Natekar, Parth and Krishnamurthi, Ganapathy and Srinivasan, Balaji},
  journal={arXiv preprint arXiv:2008.06457},
  year={2020}
}

@article{natekar2020demystifying,
  title={Demystifying Brain Tumor Segmentation Networks: Interpretability and Uncertainty Analysis},
  author={Natekar, Parth and Kori, Avinash and Krishnamurthi, Ganapathy},
  journal={Frontiers in Computational Neuroscience},
  volume={14},
  pages={6},
  year={2020},
  publisher={Frontiers}
}
```

Defined Pipeline

(figure: the BioExp analysis pipeline)

Installation

Running the explainability pipeline requires a GPU and several deep learning libraries.

Requirements

  • pandas
  • numpy
  • scipy==1.6.0
  • matplotlib
  • pillow
  • SimpleITK
  • opencv-python
  • tensorflow-gpu==1.14
  • keras
  • keras-vis
  • lucid

The following command installs BioExp together with the dependencies listed above:

pip install BioExp

Ablation

Usage

```
from BioExp.spatial import Ablation

A = Ablation(model=model,
             weights_pth=weights_path,
             metric=dice_label_coef,
             layer_name=layer_name,
             test_image=test_image,
             gt=gt,
             classes=infoclasses,
             nclasses=4)

df = A.ablate_filter(step=1)
```
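Conceptually, filter ablation zeroes one convolutional filter at a time and records the drop in a performance metric (Dice here) relative to the intact model. The following is a hypothetical numpy sketch of that loop with a toy forward pass, not BioExp's internals:

```python
import numpy as np

def dice(pred, gt):
    """Dice coefficient for binary masks."""
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum() + 1e-8)

def ablate_filters(weights, forward, image, gt):
    """Zero each filter in a (h, w, in, n_filters) kernel in turn and
    measure the Dice drop relative to the full model."""
    baseline = dice(forward(image, weights), gt)
    drops = []
    for k in range(weights.shape[-1]):
        w = weights.copy()
        w[..., k] = 0.0                                  # ablate filter k
        drops.append(baseline - dice(forward(image, w), gt))
    return np.array(drops)

# Toy "forward pass": threshold the image at the mean kernel magnitude.
def toy_forward(image, w):
    return image > w.mean()

img = np.array([[0.2, 0.8], [0.6, 0.4]])
gt = img > 0.5
w0 = np.full((1, 1, 1, 2), 0.5)                          # two toy "filters"
drops = ablate_filters(w0, toy_forward, img, gt)
print(drops.shape)  # (2,)
```

Filters whose removal causes a large Dice drop are the ones the model relies on for that class.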

Dissection

Usage

```
from BioExp.spatial import Dissector

layer_name = 'conv2d_3'
infoclasses = {}
for i in range(1):
    infoclasses['class_' + str(i)] = (i,)
infoclasses['whole'] = (1, 2, 3)

dissector = Dissector(model=model, layer_name=layer_name)

threshold_maps = dissector.get_threshold_maps(dataset_path=data_root_path,
                                              save_path=save_path,
                                              percentile=85)
dissector.apply_threshold(image, threshold_maps,
                          nfeatures=9,
                          save_path=save_path,
                          ROI=ROI)

dissector.quantify_gt_features(image, gt, threshold_maps,
                               nclasses=infoclasses,
                               nfeatures=9,
                               save_path=save_path,
                               save_fmaps=False,
                               ROI=ROI)
```
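The core of dissection-style analysis is to binarize each feature map at a dataset-level activation percentile and score its overlap with a ground-truth region. A minimal numpy sketch of that scoring step (function names here are illustrative, not BioExp's API):

```python
import numpy as np

def threshold_map(feature_map, percentile=85):
    """Binarize a feature map at a given activation percentile."""
    return feature_map >= np.percentile(feature_map, percentile)

def concept_iou(feature_map, gt_mask, percentile=85):
    """IoU between the thresholded feature map and a ground-truth region."""
    binary = threshold_map(feature_map, percentile)
    inter = np.logical_and(binary, gt_mask).sum()
    union = np.logical_or(binary, gt_mask).sum()
    return inter / (union + 1e-8)

# A feature map whose hottest 15% exactly covers the ground truth scores ~1.
fmap = np.arange(100, dtype=float).reshape(10, 10)
gt = fmap >= 85
print(round(concept_iou(fmap, gt), 3))  # 1.0
```

High-overlap units are then interpreted as detectors for that anatomical concept.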

Results

(figure: dissection results)

GradCAM

Usage

```
from BioExp.spatial import flow

dice = flow.cam(model, img, gt,
                nclasses=nclasses,
                save_path=save_path,
                layer_idx=-1,
                threshold=0.5,
                modifier='guided')
```
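Grad-CAM itself reduces to weighting each feature map of the chosen layer by the spatial mean of its gradient, summing over channels, and passing the result through a ReLU. A minimal numpy sketch of that combination step, assuming the activations and gradients have already been extracted:

```python
import numpy as np

def grad_cam(activations, gradients):
    """activations, gradients: (H, W, K) arrays from the chosen layer.
    Returns a (H, W) heatmap normalized to [0, 1]."""
    alphas = gradients.mean(axis=(0, 1))        # one importance weight per channel
    cam = np.maximum((activations * alphas).sum(axis=-1), 0.0)  # weighted sum + ReLU
    return cam / (cam.max() + 1e-8)

acts = np.random.default_rng(1).random((8, 8, 4))
grads = np.ones((8, 8, 4))
heatmap = grad_cam(acts, grads)
print(heatmap.shape)  # (8, 8)
```

The heatmap is then upsampled to the input resolution and overlaid on the image; the `modifier='guided'` option above additionally masks negative gradients during backpropagation.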

Results

(figure: Grad-CAM results)

Activation Maximization

Usage

```
from BioExp.concept.feature import Feature_Visualizer

class Load_Model(Model):
    model_path = '../../saved_models/model_flair_scaled/model.pb'
    image_shape = [None, 1, 240, 240]
    image_value_range = (0, 10)
    input_name = 'input_1'

E = Feature_Visualizer(Load_Model,
                       save_path='../results/',
                       regularizer_params={'L1': 1e-3, 'rotate': 8})
a = E.run(layer='conv2d_17', class_='None', channel=95, transforms=True)
```

Activation Results

(figure: activation maximization results)

Uncertainty

Usage

```
from BioExp.uncertainty import uncertainty

D = uncertainty(test_image)

# aleatoric
mean, var = D.aleatoric(model, iterations=50)

# epistemic
mean, var = D.epistemic(model, iterations=50)

# combined
mean, var = D.combined(model, iterations=50)
```
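The aleatoric estimate follows the standard test-time augmentation recipe: predict on randomly perturbed copies of the input, map each prediction back to the original orientation, and take the per-pixel variance across copies. A toy numpy sketch, with random flips standing in for a full augmentation pipeline and a noisy lambda standing in for a real network:

```python
import numpy as np

rng = np.random.default_rng(0)

def tta_uncertainty(model, image, iterations=50):
    """Aleatoric uncertainty via test-time augmentation (random flips here).
    Each augmented prediction is flipped back before aggregation."""
    preds = []
    for _ in range(iterations):
        axis = int(rng.integers(0, 2))
        flipped = np.flip(image, axis=axis)
        pred = np.flip(model(flipped), axis=axis)   # undo the augmentation
        preds.append(pred)
    preds = np.stack(preds)
    # Mean is the prediction; per-pixel variance is the uncertainty map.
    return preds.mean(axis=0), preds.var(axis=0)

# Toy "model": adds pixel-wise noise, so the variance map is non-zero.
noisy_model = lambda x: x + 0.1 * rng.standard_normal(x.shape)
mean, var = tta_uncertainty(noisy_model, np.zeros((4, 4)))
print(mean.shape, var.shape)  # (4, 4) (4, 4)
```

The epistemic estimate is analogous but keeps dropout active at test time (Monte-Carlo dropout) and takes the variance across repeated forward passes instead of across augmentations.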

Results

(figure: uncertainty maps)

Radiomics

Usage

```
from BioExp.helpers import radfeatures

feat_extractor = radfeatures.ExtractRadiomicFeatures(image, mask, save_path=pth)
df = feat_extractor.all_features()
```
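Radiomic features are statistics computed over the image intensities inside the mask. As a hedged illustration (not the extractor's actual feature set), a few first-order features can be computed directly with numpy:

```python
import numpy as np

def first_order_features(image, mask, bins=16):
    """A few first-order radiomic features over the masked region."""
    vals = image[mask.astype(bool)]
    hist, _ = np.histogram(vals, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                                   # drop empty bins for entropy
    return {
        "mean": float(vals.mean()),
        "variance": float(vals.var()),
        "entropy": float(-(p * np.log2(p)).sum()), # Shannon entropy of the histogram
    }

img = np.random.default_rng(0).random((32, 32))
roi = np.zeros((32, 32), dtype=bool)
roi[8:24, 8:24] = True                             # hypothetical region of interest
feats = first_order_features(img, roi)
print(sorted(feats))  # ['entropy', 'mean', 'variance']
```

Full extractors add shape and texture (GLCM/GLRLM-style) families on top of these first-order statistics.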

Causal Inference Pipeline

(figure: causal inference pipeline)

Contact

  • Avinash Kori (koriavinash1@gmail.com)
  • Parth Natekar (parth@smail.iitm.ac.in)

Owner

  • Name: Avinash
  • Login: koriavinash1
  • Kind: user
  • Location: ---
  • Company: ---

Causality and XAI

GitHub Events

Total
  • Watch event: 1
Last Year
  • Watch event: 1

Committers

Last synced: over 1 year ago

All Time
  • Total Commits: 200
  • Total Committers: 5
  • Avg Commits per committer: 40.0
  • Development Distribution Score (DDS): 0.24
Past Year
  • Commits: 0
  • Committers: 0
  • Avg Commits per committer: 0.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
koriavinash1 k****1@g****m 152
Parth Natekar p****6@g****m 42
dradientgescent 3****t 3
zsfVishnu v****2@g****m 2
Avinash Kori a****e@g****m 1

Issues and Pull Requests

Last synced: 8 months ago

All Time
  • Total issues: 7
  • Total pull requests: 10
  • Average time to close issues: 12 days
  • Average time to close pull requests: 3 months
  • Total issue authors: 1
  • Total pull request authors: 1
  • Average comments per issue: 0.43
  • Average comments per pull request: 1.1
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 10
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • koriavinash1 (7)
Pull Request Authors
  • dependabot[bot] (10)
Top Labels
Issue Labels
enhancement (2) Emergency (2) bug (1)
Pull Request Labels
dependencies (10)

Packages

  • Total packages: 1
  • Total downloads:
    • pypi 18 last-month
  • Total dependent packages: 0
  • Total dependent repositories: 1
  • Total versions: 3
  • Total maintainers: 1
pypi.org: bioexp

Deep Learning model analysis toolbox

  • Versions: 3
  • Dependent Packages: 0
  • Dependent Repositories: 1
  • Downloads: 18 Last month
Rankings
Dependent packages count: 10.1%
Stargazers count: 11.6%
Forks count: 14.2%
Average: 19.6%
Dependent repos count: 21.5%
Downloads: 40.4%
Maintainers (1)
Last synced: 7 months ago

Dependencies

requirements.txt pypi
  • keras ==2.1.5
  • keras-vis *
  • opencv-python *
  • pandas *
  • pillow *
  • ppgm ==0.0.4
  • scipy >=0.19.0
  • simpleitk *
  • sklearn *
  • tensorflow-gpu ==1.14
  • tqdm *
.github/workflows/pythonpackage.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v1 composite
.github/workflows/pythonpublish.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v1 composite