Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Ready-to-deploy Docker image for Functionary LLM served as an OpenAI-Compatible API.
https://github.com/ivangabriele/docker-functionary
ai docker docker-hub docker-image functionary functions large-language-models llama2 llm openai openai-api server vllm
- Host: GitHub
- URL: https://github.com/ivangabriele/docker-functionary
- Owner: ivangabriele
- License: agpl-3.0
- Created: 2023-10-14T12:24:59.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-05-26T02:44:29.000Z (5 months ago)
- Last Synced: 2024-10-23T11:57:48.129Z (15 days ago)
- Topics: ai, docker, docker-hub, docker-image, functionary, functions, large-language-models, llama2, llm, openai, openai-api, server, vllm
- Language: Dockerfile
- Homepage:
- Size: 29.3 KB
- Stars: 5
- Watchers: 1
- Forks: 1
- Open Issues: 4
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE.md
- Code of conduct: CODE_OF_CONDUCT.md
- Security: SECURITY.md
Awesome Lists containing this project
README
# Functionary Docker Image
[![img-github]][link-github]
[![img-docker]][link-docker]
[![img-runpod]][link-runpod]

Ready-to-deploy Docker image including the [Functionary LLM][link-functionary] served via an OpenAI-Compatible API.
> [!WARNING]
> This LLM supports Function Calling. **However**, it needs some hacking on the client side
> because the Functionary vLLM script doesn't respect the OpenAI API Function Calling format.
> I will provide an example project showing how to achieve that in the coming days.

> [!IMPORTANT]
> There is currently no way to set an API key.
> I will work on adding that option as an environment variable.
> In the meantime, don't share your endpoint with anybody.

## Specifications
### API
- Default Port: `8000`
- Path: `/v1`

Example config for your OpenAI-compatible client (here using a RunPod endpoint):
```jsonc
{
  "model": "musabgultekin/functionary-7b-v1",
  "api_base": "https://[YOUR_CONTAINER_ID]-8000.proxy.runpod.net/v1",
  "api_key": "functionary", // Dummy API key since it can't be `null`
  "api_type": "open_ai"
}
```

### Docker Image
#### Base
- [nvidia/pytorch](https://catalog.ngc.nvidia.com/orgs/nvidia/containers/pytorch)
#### Content
- Python 3.9
- [Functionary LLM](https://github.com/MeetKai/functionary)
- [vLLM Server](https://github.com/vllm-project/vllm)

## Deployment
### RunPod
[![img-runpod]][link-runpod]
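Once the container is running (locally on port `8000` or behind a RunPod proxy), any OpenAI-compatible client can talk to the `/v1` API. A minimal Python sketch of building a chat-completion request against that endpoint — the helper name is my own illustration, not part of this image, and no network call is made here:

```python
import json
from urllib.request import Request


def chat_completion_request(api_base: str, api_key: str, model: str, messages):
    """Build an OpenAI-style /chat/completions request for the endpoint."""
    payload = {"model": model, "messages": messages}
    return Request(
        f"{api_base}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            # Dummy key: the server doesn't check it yet, but clients require one.
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


req = chat_completion_request(
    "http://localhost:8000/v1",  # or your RunPod proxy URL
    "functionary",
    "musabgultekin/functionary-7b-v1",
    [{"role": "user", "content": "Hello!"}],
)
print(req.full_url)  # http://localhost:8000/v1/chat/completions
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) would return the usual OpenAI-style JSON response body.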
---
[img-docker]: https://img.shields.io/docker/pulls/ivangabriele/functionary?style=for-the-badge
[img-runpod]: https://img.shields.io/badge/RunPod-Deploy-673ab7?style=for-the-badge
[img-github]: https://img.shields.io/badge/Github-Repo-black?logo=github&style=for-the-badge
[img-github-actions]: https://img.shields.io/github/actions/workflow/status/ivangabriele/docker-functionary/main.yml?branch=main&style=for-the-badge

[link-docker]: https://hub.docker.com/r/ivangabriele/functionary
[link-functionary]: https://github.com/MeetKai/functionary
[link-github]: https://github.com/ivangabriele/docker-functionary
[link-github-actions]: https://github.com/ivangabriele/docker-functionary/actions/workflows/main.yml
[link-runpod]: https://runpod.io/gsc?template=sihvefhjru&ref=s0k66ov1