# FastApi Gen

Create LLM-enabled FastAPI applications without build configuration.


---

FastApi Gen works on macOS and Linux.

If something doesn’t work, please [file an issue](https://github.com/mirpo/fastapi-gen/issues/new).

## Quick Overview

```console
pip3 install fastapi-gen
fastapi-gen my_app
cd my_app
make start-dev
```

or

```console
pipx run fastapi-gen my_app
cd my_app
make start-dev
```

If you've previously installed `fastapi-gen` globally via `pip3 install fastapi-gen`, we recommend you reinstall the package using `pip3 install --upgrade --force-reinstall fastapi-gen` or `pipx upgrade fastapi-gen` to ensure that you use the latest version.

Then open http://localhost:8000/docs to see your app's OpenAPI documentation.

Available templates:

1. Default - basic template with GET/POST examples.
2. NLP - natural language processing template with examples of how to use local Hugging Face models for summarization, named-entity recognition, and LLM text generation.
3. Langchain - template with examples of how to use LangChain with local Hugging Face models (LLMs) for text generation and question answering.
4. Llama - template with examples of how to use llama.cpp and llama-cpp-python with a local Llama 2 model for question answering.

*Important notes*:
- The Langchain template requires capable hardware and will automatically download the required models, so be patient.
- The Llama template downloads a roughly 4 GB model from Hugging Face and needs more than 4 GB of RAM.

Each template includes not only code, but also **tests**.

### Get Started Immediately

You **don’t** need to install or configure dependencies like FastAPI or pytest.

They are preconfigured and hidden so that you can focus on the code.

Create a project, and you’re good to go.

## Creating an App

**You’ll need Python 3.7 or later on your local development machine**. We recommend using the latest stable version. You can use [pyenv](https://github.com/pyenv/pyenv) (macOS/Linux) to switch Python versions between different projects.

### Basic template

```console
pip3 install fastapi-gen
fastapi-gen my_app
```

or

```console
pip3 install fastapi-gen
fastapi-gen my_app --template hello_world
```

### NLP template

```console
pip install fastapi-gen
fastapi-gen my_app --template nlp
```

### Langchain template

```console
pip install fastapi-gen
fastapi-gen my_app --template langchain
```

### Llama template

```console
pip install fastapi-gen
fastapi-gen my_app --template llama
```

Inside the newly created project, you can run some built-in commands:

### `make start`

Runs the app in development mode.

Open [http://localhost:8000/docs](http://localhost:8000/docs) to view OpenAPI documentation in the browser.

The page will automatically reload if you make changes to the code.

### `make test`

Runs tests.

By default, runs tests related to files changed since the last commit.

## License

`fastapi-gen` is distributed under the terms of the [MIT](https://github.com/mirpo/fastapi-gen/blob/master/LICENSE) license.