Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/dino-kupinic/ai-backend
ai backend for your app powered by llama3
fastapi llama3 meta-ai ollama python3
- Host: GitHub
- URL: https://github.com/dino-kupinic/ai-backend
- Owner: Dino-Kupinic
- License: mit
- Created: 2024-05-22T06:32:31.000Z (7 months ago)
- Default Branch: main
- Last Pushed: 2024-10-21T06:54:24.000Z (2 months ago)
- Last Synced: 2024-10-21T09:35:09.736Z (2 months ago)
- Topics: fastapi, llama3, meta-ai, ollama, python3
- Language: Python
- Homepage:
- Size: 519 KB
- Stars: 4
- Watchers: 1
- Forks: 0
- Open Issues: 7
Metadata Files:
- Readme: README.md
- Funding: .github/FUNDING.yml
- License: LICENSE
Awesome Lists containing this project
README
# AI Backend
> [!CAUTION]
> AI Backend is still in development. You will find bugs and broken or unfinished features.

## Overview
ai-backend is a backend for AI-powered applications. It leverages FastAPI and Ollama to provide a robust API for natural language processing tasks.
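The underlying pattern is a FastAPI route that forwards the request text to a local Ollama instance and streams the model's reply back to the client. The following is a minimal sketch, not the repository's actual source: the endpoint path and request shape are inferred from the Usage example further below, and it assumes the official `ollama` Python client with a local `llama3` model already pulled.

```python
# Minimal sketch (not the repository's actual code): a FastAPI endpoint that
# streams a llama3 reply from a local Ollama instance.
import ollama
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

app = FastAPI()


class Message(BaseModel):
    text: str


@app.post("/message/")
def message(body: Message):
    def generate():
        # stream=True yields response chunks as Ollama produces them
        for chunk in ollama.chat(
            model="llama3",
            messages=[{"role": "user", "content": body.text}],
            stream=True,
        ):
            yield chunk["message"]["content"]

    return StreamingResponse(generate(), media_type="text/plain")
```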
## Installation and Configuration
### Prerequisites
- Python 3.12
- pip
- git

### Installation for Development
1. Clone the repository
```bash
git clone https://github.com/Dino-Kupinic/ai-backend.git
```
2. Install dependencies
```bash
pip install -r requirements.txt
```
3. Create a `.env` file in the root directory and copy over the fields from the `.env.example` file.
4. Download Ollama for your system from [here](https://ollama.com/download).
> [!NOTE]
> In the future, Ollama will be downloaded from the command line automatically.

5. Run the server
```bash
fastapi dev src/main.py
```

## Documentation
### OpenAPI Documentation
The OpenAPI documentation is available at `/docs`. It is automatically generated from the code.
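FastAPI also serves the raw schema at `/openapi.json` by default, so the generated spec can be inspected programmatically. A small sketch, assuming the server runs locally on port 8000 as in the Usage example:

```python
# Sketch: list the routes exposed by the auto-generated OpenAPI schema.
import requests

schema = requests.get("http://localhost:8000/openapi.json").json()
for path, methods in schema["paths"].items():
    print(path, "->", ", ".join(m.upper() for m in methods))
```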
### Configuration
// WIP
### Usage
```bash
curl -X POST "http://localhost:8000/message/" -H "Content-Type: application/json" -d '{"text": "Tell me something about Vienna, Austria"}' --no-buffer
```

> [!TIP]
> `--no-buffer` is needed because the response is streamed.

// WIP
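The same call can be made from Python with the `requests` library. This is an illustrative sketch; only the endpoint and payload come from the curl example above:

```python
# Illustrative client sketch: consume the streamed /message/ response.
import requests

resp = requests.post(
    "http://localhost:8000/message/",
    json={"text": "Tell me something about Vienna, Austria"},
    stream=True,  # the requests equivalent of curl's --no-buffer
)
resp.raise_for_status()

# Print chunks as they arrive instead of waiting for the full response.
for chunk in resp.iter_content(chunk_size=None, decode_unicode=True):
    print(chunk, end="", flush=True)
```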
## Testing
To run the test suite:
1. Ensure that both the AI Backend and Ollama services are running.
2. Execute the following command:
```bash
pytest
```

This will run all tests in the `tests/` directory.
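As a rough idea of what such a test can look like, here is an illustrative sketch; the test name, prompt, and assertions are not taken from the repository's `tests/` directory:

```python
# Illustrative test sketch: assumes the AI Backend and Ollama are running locally.
import requests

BASE_URL = "http://localhost:8000"


def test_message_endpoint_returns_streamed_text():
    resp = requests.post(
        f"{BASE_URL}/message/",
        json={"text": "Say hello in one short sentence."},
        stream=True,
    )
    assert resp.status_code == 200
    body = "".join(resp.iter_content(chunk_size=None, decode_unicode=True))
    assert body  # the model should return some non-empty text
```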
## Contributing
// WIP
## Resources
- [FastAPI](https://fastapi.tiangolo.com/)
- [Ollama](https://ollama.com/)
- [Pydantic](https://pydantic-docs.helpmanual.io/)

## License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## Acknowledgements
- Special thanks to the FastAPI and Ollama communities for their excellent tools and documentation
---
For more information, please [open an issue](https://github.com/Dino-Kupinic/ai-backend/issues) or contact the maintainers.