https://github.com/simonw/llm-fireworks
Access fireworks.ai models via API
- Host: GitHub
- URL: https://github.com/simonw/llm-fireworks
- Owner: simonw
- License: apache-2.0
- Created: 2024-04-18T23:27:22.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-04-18T23:29:00.000Z (about 1 year ago)
- Last Synced: 2024-10-18T07:53:37.956Z (9 months ago)
- Language: Python
- Homepage:
- Size: 6.84 KB
- Stars: 9
- Watchers: 1
- Forks: 3
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# llm-fireworks
[PyPI](https://pypi.org/project/llm-fireworks/)
[Releases](https://github.com/simonw/llm-fireworks/releases)
[Tests](https://github.com/simonw/llm-fireworks/actions/workflows/test.yml)
[License](https://github.com/simonw/llm-fireworks/blob/main/LICENSE)

Access [fireworks.ai](https://fireworks.ai/) models via API
## Installation
Install this plugin in the same environment as [LLM](https://llm.datasette.io/).
```bash
llm install llm-fireworks
```
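Once installed, you can confirm the plugin is registered. This check is not in the original README, but `llm plugins` is LLM's standard command for listing installed plugins:
```bash
# List installed plugins - llm-fireworks should appear in the output
llm plugins
```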
## Usage
Obtain a [Fireworks API key](https://fireworks.ai/api-keys) and save it like this:
```bash
llm keys set fireworks
# Paste key here
```
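If you would rather not store the key, LLM's standard `--key` option can supply it for a single invocation instead. A minimal sketch, assuming you keep the key in a `FIREWORKS_API_KEY` shell variable of your own choosing:
```bash
# Pass the key directly for one prompt instead of using the stored key
llm -m fireworks/models/llama-v3-70b-instruct 'hello' --key "$FIREWORKS_API_KEY"
```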
Run `llm models` to get a list of models. Run prompts like this:
```bash
llm -m fireworks/models/llama-v3-70b-instruct 'five great names for a pet ocelot'
```
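The usual `llm` CLI options work with these models as well. A sketch using standard flags not shown in the original README: `-s` supplies a system prompt, and `llm chat` starts an interactive session:
```bash
# Supply a system prompt alongside the regular prompt
llm -m fireworks/models/llama-v3-70b-instruct \
  -s 'You are a terse naming consultant' \
  'five great names for a pet ocelot'

# Start an interactive chat session with the same model
llm chat -m fireworks/models/llama-v3-70b-instruct
```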
## Development
To set up this plugin locally, first check out the code. Then create a new virtual environment:
```bash
cd llm-fireworks
python3 -m venv venv
source venv/bin/activate
```
Now install the dependencies and test dependencies:
```bash
llm install -e '.[test]'
```
To run the tests:
```bash
pytest
```
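pytest's usual selection flags work here too; for example (standard pytest options, nothing specific to this repo):
```bash
# Verbose output, stopping at the first failure
pytest -vv -x

# Run only tests whose names match a keyword expression
# (substitute a keyword that matches the test names in this repo)
pytest -k fireworks
```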