Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/lckr/minmamba
Minimal PyTorch reimplementation of the Mamba architecture à la Karpathy's minGPT
- Host: GitHub
- URL: https://github.com/lckr/minmamba
- Owner: lckr
- License: mit
- Created: 2024-02-24T17:25:40.000Z (9 months ago)
- Default Branch: main
- Last Pushed: 2024-06-22T14:07:20.000Z (5 months ago)
- Last Synced: 2024-10-18T19:10:35.739Z (28 days ago)
- Language: Python
- Homepage:
- Size: 10.7 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# minMamba
A simple PyTorch re-implementation of [Mamba](https://github.com/state-spaces/mamba) in a single file. minMamba tries to be small, clean, interpretable and educational.
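For orientation, the heart of Mamba is a selective state-space recurrence: the state transition is discretized with an input-dependent step size `delta`, and the projections `B` and `C` also depend on the input. The sketch below is only an illustration of that recurrence, not code from this repository; the name `selective_scan` and all tensor shapes are assumptions, and the Python loop is the naive sequential form rather than the hardware-aware parallel scan of the official implementation.

```python
import torch

def selective_scan(x, delta, A, B, C):
    """Naive selective state-space scan (illustrative sketch, not the repo's code).

    x:     (batch, length, d_model)  input sequence
    delta: (batch, length, d_model)  input-dependent step sizes
    A:     (d_model, d_state)        state transition parameters
    B:     (batch, length, d_state)  input-dependent input projection
    C:     (batch, length, d_state)  input-dependent output projection
    Returns y with shape (batch, length, d_model).
    """
    b, l, d = x.shape
    n = A.shape[-1]
    h = torch.zeros(b, d, n, device=x.device, dtype=x.dtype)
    ys = []
    for t in range(l):
        dt = delta[:, t].unsqueeze(-1)                 # (b, d, 1)
        A_bar = torch.exp(dt * A)                      # discretized transition, (b, d, n)
        B_bar = dt * B[:, t].unsqueeze(1)              # discretized input matrix, (b, d, n)
        h = A_bar * h + B_bar * x[:, t].unsqueeze(-1)  # state update
        ys.append((h * C[:, t].unsqueeze(1)).sum(-1))  # project state to output, (b, d)
    return torch.stack(ys, dim=1)

# Tiny smoke test with random tensors (negative A keeps the recurrence stable):
y = selective_scan(torch.randn(2, 16, 64), torch.rand(2, 16, 64).mul(0.1),
                   -torch.rand(64, 8), torch.randn(2, 16, 8), torch.randn(2, 16, 8))
print(y.shape)  # torch.Size([2, 16, 64])
```

The official implementation replaces this loop with a fused, hardware-aware scan; an educational re-implementation typically keeps the readable sequential form.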
### Library Installation
If you want to `import minmamba` into your project:
```
git clone https://github.com/lckr/minMamba.git
cd minMamba
pip install -e .
```

### Usage
Here's how you'd load a pretrained Mamba model from Huggingface Hub:
```python
import minmamba.model
pretrained_model = minmamba.model.MambaLMModel.from_pretrained("state-spaces/mamba-130m")
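# Assuming MambaLMModel is a torch.nn.Module (this is a PyTorch re-implementation),
# switch to evaluation mode before running inference:
pretrained_model.eval()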
```

And here's how you'd run inference with it:
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # tokenizer used by "state-spaces/mamba-130m"
input_seq = tokenizer("A fish is a ", return_tensors="pt")["input_ids"]
gen_seq = pretrained_model.generate(input_seq, 100)
print(tokenizer.decode(gen_seq[0]))
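# minGPT-style generate calls typically return the prompt tokens followed by the
# continuation; assuming that convention here, this prints only the new tokens:
print(tokenizer.decode(gen_seq[0, input_seq.shape[1]:]))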
```

### References
Code:
- [state-spaces/mamba](https://github.com/state-spaces/mamba) the official Mamba implementation released by the authors
- [karpathy/minGPT](https://github.com/karpathy/minGPT) Andrej Karpathy's minGPT
- [rjb7731/nanoMamba](https://github.com/rjb7731/nanoMamba) Ryan Brady's nanoMamba implementation

Paper:
> **Mamba: Linear-Time Sequence Modeling with Selective State Spaces**\
> Albert Gu*, Tri Dao*\
> Paper: https://arxiv.org/abs/2312.00752

### License
MIT