# MAV - Model Activity Visualiser

> **Visualize the internal workings of Large Language Models as they generate text**


*(MAV demo animation)*

## 🚀 Getting Started

### Method 1: Using `uv` (Recommended)

```sh
# Run with PyPI package
uv run --with openmav mav

# Or run directly from GitHub
uv run --with git+https://github.com/attentionmech/mav mav --model gpt2 --prompt "hello mello"
```

**Note**: You can replace `gpt2` with any other Hugging Face model that works with `transformers`, for example:

- `HuggingFaceTB/SmolLM-135M`
- `gpt2-medium`
- `gpt2-large`
- `meta-llama/Llama-3.2-1B`

For gated repositories, run `huggingface-cli login` first and make sure your account has been granted access to the model.
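In a script you can check ahead of time whether a token is available before loading a gated model. This minimal sketch uses the `HF_TOKEN` environment variable, which is the standard Hugging Face convention (not anything MAV-specific):

```python
import os

# Gated models (e.g. meta-llama/Llama-3.2-1B) require an authenticated
# Hugging Face session. `huggingface-cli login` stores a token on disk;
# scripts can also supply one via the HF_TOKEN environment variable.
token = os.environ.get("HF_TOKEN")
if token is None:
    print("No HF_TOKEN found; gated models may fail to download.")
else:
    print("HF_TOKEN is set; gated models should be accessible.")
```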

### Method 2: Using `pip`

1. Set up and activate a virtual environment
2. Install the package:
```sh
# From PyPI
pip install openmav

# Or from GitHub
pip install git+https://github.com/attentionmech/mav
```
3. Run the visualizer:
```sh
mav --model gpt2 --prompt "hello mello"
```
4. Or import in your code:
```python
from openmav.mav import MAV
MAV("gpt2", "Hello")
```
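For scripts, the import can be guarded and the model/prompt made configurable. A minimal sketch; the `MAV_MODEL` and `MAV_PROMPT` environment variable names are illustrative, not part of openmav:

```python
import os

# Pick model and prompt from the environment, falling back to the
# defaults used throughout this README.
model = os.environ.get("MAV_MODEL", "gpt2")
prompt = os.environ.get("MAV_PROMPT", "hello mello")

try:
    from openmav.mav import MAV
except ImportError:
    MAV = None  # openmav not installed; run `pip install openmav` first

if MAV is not None:
    MAV(model, prompt)
```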

### Method 3: Local Development

1. Clone the repository:
```sh
git clone https://github.com/attentionmech/mav
cd mav
```
2. Set up and activate a virtual environment
3. Install in development mode:
```sh
pip install .
```
4. Run the visualizer:
```sh
mav --model gpt2 --prompt "hello mello"
```

### Method 4: Jupyter Notebook/Colab

Use the **Open in Colab** badge in the repository to launch MAV in a notebook.

## 📚 Documentation & Tutorials

### Documentation

Check out the [documentation.md](documentation.md) file for detailed information.

### Tutorials

#### Custom Plugin Development
See the *Writing Custom Plugin Panel* tutorial in the repository.

#### Advanced Usage Examples

```sh
# Run MAV with a training loop and custom model
uv run examples/test_vis_train_loop.py

# Run with custom panel configuration
uv run --with git+https://github.com/attentionmech/mav mav \
  --model gpt2 \
  --num-grid-rows 3 \
  --selected-panels generated_text attention_entropy top_predictions \
  --max-bar-length 20 \
  --refresh-rate 0 \
  --max-new-tokens 10000
```
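The same panel configuration can also be launched from Python via `subprocess`. A sketch assuming the `mav` entry point is on `PATH`; the flags are the ones shown above:

```python
import subprocess

# Mirror of the CLI invocation above, as an argument list.
args = [
    "mav",
    "--model", "gpt2",
    "--num-grid-rows", "3",
    "--selected-panels", "generated_text", "attention_entropy", "top_predictions",
    "--max-bar-length", "20",
    "--refresh-rate", "0",
    "--max-new-tokens", "10000",
]

# Uncomment to actually launch the visualizer:
# subprocess.run(args, check=True)
```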

## 🎥 Demos

- [Basic plugins](https://x.com/attentionmech/status/1906769030540824963)
- [Interactive mode](https://x.com/attentionmech/status/1905732784314081511)
- [Limit characters](https://x.com/attentionmech/status/1905760510445850709)
- [Sample with temperature](https://x.com/attentionmech/status/1905886861245259857)
- [Running with custom model](https://x.com/attentionmech/status/1906172982294376755)
- [Panel selection](https://x.com/attentionmech/status/1906304032798339124)
- [Running in Colab notebook](https://x.com/attentionmech/status/1906657159355789593)

> **Note**: Explore additional options via the command-line help; many sampling parameters are exposed.

## 👥 Contributing

Clone the repository and install the package in development mode:

```sh
git clone https://github.com/attentionmech/mav
cd mav

# Using uv (recommended)
uv sync

# Or using pip
pip install -e .
```
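After an editable install, a quick way to confirm the package is importable from your environment (a minimal sketch using only the standard library):

```python
import importlib.util

# Sanity-check that the editable install of openmav is visible
# to the current Python interpreter.
spec = importlib.util.find_spec("openmav")
print("openmav importable:", spec is not None)
```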

## 📝 Citation

```bibtex
@article{attentionmech2025openmav,
  title={OpenMAV: Model Activity Visualiser},
  author={attentionmech},
  year={2025}
}
```

## 🧠 Trivia

This project started from a small tweet posted while testing a simple terminal UI loop.

## ⭐ Star History

[![Star History Chart](https://api.star-history.com/svg?repos=attentionmech/mav&type=Timeline)](https://www.star-history.com/#attentionmech/mav&Timeline)