https://github.com/zsxkib/cog-aya-101
Aya Model: An Instruction Finetuned Open-Access Multilingual Language Model
- Host: GitHub
- URL: https://github.com/zsxkib/cog-aya-101
- Owner: zsxkib
- License: apache-2.0
- Created: 2024-02-13T14:45:13.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-02-15T14:35:39.000Z (over 1 year ago)
- Last Synced: 2025-02-06T08:46:02.098Z (8 months ago)
- Language: Python
- Size: 8.79 KB
- Stars: 4
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# cog-aya-101
📚 Aya, an LLM by Cohere capable of understanding and generating content in 101 languages 🗣️ [Run on Replicate](https://replicate.com/zsxkib/aya-101)
This is an implementation of [CohereForAI/aya-101](https://huggingface.co/CohereForAI/aya-101) as a Cog model. [Cog packages machine learning models as standard containers.](https://github.com/replicate/cog)
Simply run:
```sh
$ cog predict -i prompt="भारत में इतनी सारी भाषाएँ क्यों हैं?"
भारत में कई भाषाएँ हैं और विभिन्न भाषाओं के बोली जाने वाले लोग हैं। यह विभिन्नता भाषाई विविधता और सांस्कृतिक विविधता का परिणाम है।
```
(The Hindi prompt asks, "Why does India have so many languages?"; the model replies, "India has many languages and people who speak different ones. This diversity is the result of linguistic and cultural variety.")

This will automagically download the weights too.
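Under the hood, a Cog model pairs a `cog.yaml` config with a `predict.py` that defines the inference entry point. The sketch below is an assumption, not this repository's actual code: it illustrates what a predictor for aya-101 might look like, based on the standard Cog `Predictor` interface and the Hugging Face `transformers` seq2seq API. The real `predict.py` would subclass `cog.BasePredictor`; here a plain class with deferred imports is used so the sketch stays self-contained.

```python
class Predictor:
    """Hypothetical sketch of a Cog-style predictor for CohereForAI/aya-101.

    A real implementation would subclass cog.BasePredictor; a plain class
    is used here so the sketch has no hard dependency on cog itself.
    """

    def setup(self):
        # Runs once when the container starts: downloads/loads the (very
        # large) weights. Imports are deferred so defining the class is cheap.
        from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

        self.tokenizer = AutoTokenizer.from_pretrained("CohereForAI/aya-101")
        # aya-101 is an mT5-based sequence-to-sequence model.
        self.model = AutoModelForSeq2SeqLM.from_pretrained(
            "CohereForAI/aya-101", device_map="auto"
        )

    def predict(self, prompt: str, max_new_tokens: int = 256) -> str:
        # Tokenize the prompt, generate a continuation, and decode it.
        inputs = self.tokenizer(prompt, return_tensors="pt").to(self.model.device)
        output_ids = self.model.generate(**inputs, max_new_tokens=max_new_tokens)
        return self.tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

When you run `cog predict -i prompt="..."`, Cog calls `setup()` once and then maps each `-i` flag onto the matching `predict()` argument.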
⚠️ **Important Notice:**
Before attempting to run the `aya-101` model, be aware that the model weights are extremely large. Downloading them can take a long time and requires substantial disk space, so make sure your system is prepared to handle the load. For detailed requirements and potential impact, refer to the discussion [here](https://huggingface.co/CohereForAI/aya-101/discussions/7). This model was tested on an Nvidia A40 (Large) with 64 GB of RAM.