https://github.com/tomconte/llm_openapi_tools
LLM + OpenAPI 😎
- Host: GitHub
- URL: https://github.com/tomconte/llm_openapi_tools
- Owner: tomconte
- License: mit
- Created: 2024-08-03T15:57:47.000Z (4 months ago)
- Default Branch: main
- Last Pushed: 2024-08-03T16:00:23.000Z (4 months ago)
- Last Synced: 2024-10-26T09:46:21.769Z (25 days ago)
- Language: Python
- Size: 45.9 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
README
# LLM OpenAPI Tools
This project is a collection of tools to work with Function Calling in Large Language Models (LLMs) such as OpenAI's GPT.
## Enriching OpenAPI specs for LLMs
The `llm_openapi_tools.openapi` module provides tools to enrich and specialize OpenAPI specs with additional information that can be used by LLMs to generate more accurate function calls. It uses a manifest file to configure the specialization process.
We call the resulting enriched OpenAPI spec an "LLM-friendly" spec. These specs can be thought of as a "plugin" for the LLM, providing additional information that can be used to generate function calls.
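The repository does not show the manifest format or the module's internal API, but the enrichment idea can be sketched conceptually: a manifest maps `operationId`s to overrides, and the tool produces a copy of the spec with clearer descriptions and with excluded operations removed. The `enrich_spec` function, the manifest shape, and the `exclude` flag below are all hypothetical illustrations, not the library's actual API.

```python
# Conceptual sketch only (assumed names, not llm_openapi_tools' real API):
# specialize an OpenAPI spec by overriding operation descriptions and
# dropping operations the LLM should not call, driven by a manifest dict.
import json


def enrich_spec(spec: dict, manifest: dict) -> dict:
    """Return a copy of `spec` specialized per `manifest`.

    Hypothetical manifest shape:
        {"operations": {"getPetById": {"description": "...", "exclude": False}}}
    """
    enriched = json.loads(json.dumps(spec))  # deep copy via JSON round-trip
    overrides = manifest.get("operations", {})
    for path, methods in list(enriched.get("paths", {}).items()):
        for method, op in list(methods.items()):
            if not isinstance(op, dict):
                continue
            rule = overrides.get(op.get("operationId", ""))
            if rule is None:
                continue
            if rule.get("exclude"):
                del methods[method]  # hide this operation from the LLM
            elif "description" in rule:
                op["description"] = rule["description"]  # clearer guidance
    return enriched


spec = {"paths": {"/pets/{id}": {"get": {"operationId": "getPetById",
                                         "description": "Get pet"}}}}
manifest = {"operations": {"getPetById": {
    "description": "Fetch a single pet by its numeric ID."}}}
print(enrich_spec(spec, manifest)["paths"]["/pets/{id}"]["get"]["description"])
# → Fetch a single pet by its numeric ID.
```

The key design point this illustrates: the original spec is left untouched, so the same source document can be specialized differently for different LLM use cases.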
To convert an OpenAPI spec to an LLM-friendly spec from the command line, use the `llm_openapi_tools.convert` module:
```bash
python -m llm_openapi_tools.convert openapi.json manifest.json output.json
```

The following example specs and manifest are provided in the `tests/__fixtures__` directory:
- `openapi_specs/petstore-v3-openapi.json`: A sample OpenAPI spec for the Petstore API.
- `manifests/petstore-v3.yaml`: A sample manifest file for the Petstore API.
- `llm_friendly_specs/petstore-v3-openapi-plugin.json`: The output of the conversion process.

### Examples
A number of examples are provided showing how to use the specialized OpenAPI specs to generate function calls with AI frameworks such as LangChain and Semantic Kernel. They help compare the LLM's behaviour when given the original spec versus the specialized one.
These examples are located in the `samples` directory. See the README there for more information.
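To give a flavour of what those examples do, here is a generic sketch of the step every such framework performs: mapping one OpenAPI operation into the JSON "tool"/function-calling schema passed to an LLM. The `operation_to_tool` helper and the manual parameter mapping are assumptions for illustration, not code from this repo or from LangChain/Semantic Kernel.

```python
# Illustrative sketch (hypothetical helper, not from this repo): convert an
# OpenAPI Operation Object into an OpenAI-style function-calling tool schema.
def operation_to_tool(path: str, method: str, op: dict) -> dict:
    properties, required = {}, []
    for p in op.get("parameters", []):
        # Map each OpenAPI parameter to a JSON Schema property.
        properties[p["name"]] = {
            "type": p.get("schema", {}).get("type", "string"),
            "description": p.get("description", ""),
        }
        if p.get("required"):
            required.append(p["name"])
    return {
        "type": "function",
        "function": {
            "name": op.get("operationId", f"{method}_{path}"),
            "description": op.get("description", ""),
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }


op = {"operationId": "getPetById",
      "description": "Fetch a single pet by its numeric ID.",
      "parameters": [{"name": "petId", "in": "path", "required": True,
                      "schema": {"type": "integer"},
                      "description": "ID of pet to return"}]}
tool = operation_to_tool("/pets/{petId}", "get", op)
print(tool["function"]["name"])  # → getPetById
```

An enriched description written for the LLM (rather than for human API docs) flows straight through this mapping, which is why specializing the spec changes what the model calls and how.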