https://github.com/ashutosh1919/explainable-cnn
# Explainable CNNs
[![Torch Version](https://img.shields.io/badge/torch>=1.10.0-61DAFB.svg?style=flat-square)](#torch) [![Torchvision Version](https://img.shields.io/badge/torchvision>=0.2.2-yellow.svg?style=flat-square)](#torchvision) [![Python Version](https://img.shields.io/badge/python->=3.6-blue.svg?style=flat-square)](#python) [![test workflow](https://github.com/ashutosh1919/explainable-cnn/actions/workflows/test_workflow.yml/badge.svg)](#test_workflow) [![Price](https://img.shields.io/badge/price-free-ff69b4.svg?style=flat-square)](#price) [![Maintained](https://img.shields.io/badge/maintained-yes-green.svg?style=flat-square)](#maintained)

**📦 Flexible visualization package for generating layer-wise explanations for CNNs.**

Deep learning models are commonly treated as black boxes. To address this, this project provides the flexible and easy-to-use `pip` package `explainable-cnn`, which helps you create visualizations for any `torch`-based CNN model. Note that it takes a data-centric approach, focusing on making the internal workings of the neural layers more transparent. To do so, `explainable-cnn` acts as a plug-and-play component that visualizes layers based on their gradients and builds different representations, including Saliency Map, Guided Backpropagation, Grad-CAM, and Guided Grad-CAM.
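For background, here is a generic sketch of the Grad-CAM idea in plain PyTorch (this is not this package's code): the feature maps of a late convolutional layer are weighted by the spatially pooled gradients of the class score, and only positive evidence is kept. The ResNet-18 model, the choice of `layer4` as the hooked block, and the random stand-in image are assumptions made purely for illustration.

```python
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet18().eval()   # randomly initialized here; use a trained model in practice
store = {}

def forward_hook(module, inputs, output):
    store["activations"] = output.detach()
    # Capture the gradient flowing back into this layer's output.
    output.register_hook(lambda grad: store.update(gradients=grad.detach()))

model.layer4.register_forward_hook(forward_hook)   # last convolutional block of ResNet-18

image = torch.randn(1, 3, 224, 224)                # stand-in for a preprocessed input image
scores = model(image)
scores[0, scores.argmax()].backward()              # gradient of the top class score

weights = store["gradients"].mean(dim=(2, 3), keepdim=True)        # spatially pooled gradients
cam = F.relu((weights * store["activations"]).sum(dim=1, keepdim=True))  # keep positive evidence
cam = F.interpolate(cam, size=image.shape[-2:], mode="bilinear", align_corners=False)
cam = (cam / (cam.max() + 1e-8)).squeeze().numpy()                 # normalized heatmap, (224, 224)
```

`explainable-cnn` hides this kind of gradient bookkeeping behind a single method call, as shown in the Usage section below.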

## Architecture




:star: Star us on GitHub — it helps!

## Usage

Install the package

```bash
pip install explainable-cnn
```

To create visualizations, create an instance of `CNNExplainer`.

```python
from explainable_cnn import CNNExplainer

x_cnn = CNNExplainer(...)
```
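As an illustration only, here is a sketch of preparing a `torch` model for the explainer. The ResNet-18 model and the constructor arguments shown (a model plus a class-index-to-name mapping) are assumptions; the arguments reference linked below documents the actual signature.

```python
from torchvision import models

from explainable_cnn import CNNExplainer

# Any torch-based CNN works; a ResNet-18 is used here purely for illustration.
# In practice, load your own trained model or pretrained weights instead.
model = models.resnet18()
model.eval()

# Assumed constructor arguments: the model and a class-index-to-name mapping.
# Check the package's argument reference for the exact signature.
class_map = {0: "cat", 1: "dog"}
x_cnn = CNNExplainer(model, class_map)
```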

The following method calls return `numpy` arrays corresponding to the image for different types of visualizations.

```python
saliency_map = x_cnn.get_saliency_map(...)

grad_cam = x_cnn.get_grad_cam(...)

guided_grad_cam = x_cnn.get_guided_grad_cam(...)
```
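For intuition about what these outputs represent (a generic sketch, not the package's implementation), a vanilla saliency map is just the gradient of the predicted class score with respect to the input pixels; Guided Backpropagation refines it by suppressing negative gradients at ReLU layers. A minimal plain-PyTorch sketch, assuming a ResNet-18 and a random stand-in image:

```python
import torch
from torchvision import models

model = models.resnet18().eval()   # randomly initialized here; use a trained model in practice

# Stand-in for a preprocessed input image; gradients must flow back to the pixels.
image = torch.randn(1, 3, 224, 224, requires_grad=True)
scores = model(image)
scores[0, scores.argmax()].backward()   # gradient of the top class score w.r.t. the input

# Per-pixel importance: maximum absolute gradient across the color channels.
saliency_map = image.grad.abs().max(dim=1).values.squeeze().numpy()   # shape (224, 224)
```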

To see the full list of arguments and their usage for all methods, please refer to this file.


You may also want to look at the example usage in the example notebook.

## Output

Below is a comparison of the visualizations generated by Grad-CAM and Guided Grad-CAM.



## Contributors ✨

Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):



- Ashutosh Hathidara: 💻 🎨 🔬 🚧 ⚠️
- Lalit Pandey: 🔬 📖

This project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification. Contributions of any kind are welcome!

## References

- [Grad-CAM: Visual Explanations from Deep Networks via Gradient-based Localization](https://arxiv.org/pdf/1610.02391.pdf)
- [Grad CAM demonstrations in PyTorch](https://github.com/kazuto1011/grad-cam-pytorch)