Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/EleutherAI/aria
- Host: GitHub
- URL: https://github.com/EleutherAI/aria
- Owner: EleutherAI
- License: apache-2.0
- Created: 2023-05-19T17:34:59.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-07-18T13:22:20.000Z (4 months ago)
- Last Synced: 2024-07-18T16:39:37.973Z (4 months ago)
- Language: Python
- Size: 417 KB
- Stars: 38
- Watchers: 3
- Forks: 7
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# gpt-aria
[Discord](https://discord.com/invite/zBGx3azzUn)
A repository containing resources for pre-training, fine-tuning, and evaluating musical (MIDI) transformer models.
***Note that this project is under active development***
## Description
The main goal of the gpt-aria project is to create a suite of powerful pre-trained generative (symbolic) music models. We want to investigate how modern training (pre-training & fine-tuning) techniques can be used to improve the quality/usefulness of such models. Alongside this we are building various data (MIDI) preprocessing tools, allowing **you** to easily fine-tune our models on your own data.
If you are new to symbolic music models, the following projects/blog posts by Google Magenta and OpenAI are a good place to start:
- [Music Transformer](https://magenta.tensorflow.org/music-transformer)
- [MuseNet](https://openai.com/research/musenet)

Long story short: Transformer + MIDI + GPUs = 🎵 x ∞
## Installation
Make sure you are using Python 3.10+. Note that this project has not been explicitly developed for anything other than Linux. If you are using Windows, things might not work properly; in that case, I suggest installing via WSL.
```
git clone https://github.com/EleutherAI/aria
cd aria
pip install -e .
```
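Assuming the editable install succeeded, the `aria` command-line entry point should be available. A quick sanity check is to print the help text for the sampling command used below:
```
# If installation worked, this should print the available sampling options without errors.
aria sample -h
```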
## Inference
You can find preliminary checkpoints at the following locations:
Finetuned piano-only checkpoints (improved robustness):
```
large - https://storage.googleapis.com/aria-checkpoints/large-abs-inst.safetensors
```
Pretrained checkpoints:
```
large - https://storage.googleapis.com/aria-checkpoints/large-abs-pt.bin
medium - https://storage.googleapis.com/aria-checkpoints/medium-abs-pt.bin
small - https://storage.googleapis.com/aria-checkpoints/small-abs-pt.bin
```
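The checkpoints are plain files hosted on Google Cloud Storage, so any HTTP client can fetch them. For example, with `wget` (assumed to be installed) you could grab the fine-tuned large checkpoint and the medium pretrained checkpoint listed above:
```
# Download checkpoints into the current directory
wget https://storage.googleapis.com/aria-checkpoints/large-abs-inst.safetensors
wget https://storage.googleapis.com/aria-checkpoints/medium-abs-pt.bin
```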
```
aria sample \
-m large \
-c \
-p \
-var \
-trunc \
-l \
-temp 0.95 \
-e
```You can use `aria sample -h` to see a full list of options. If you wish to sample from a pretrained checkpoint, please use the `-pt` flag.
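As a rough sketch only (the exact flag combination here is an assumption; confirm it against `aria sample -h`), sampling from the medium pretrained checkpoint downloaded above might look like:
```
# Hypothetical invocation: same placeholders as above, with -c pointing at a
# pretrained checkpoint file and the -pt flag added.
aria sample \
  -m medium \
  -c medium-abs-pt.bin \
  -p <path-to-prompt-midi> \
  -var <num-variations> \
  -trunc <truncation-length> \
  -l <generation-length> \
  -temp 0.95 \
  -e \
  -pt
```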