Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/s1m0n38/ai.nvim
Query LLMs following OpenAI API specification
- Host: GitHub
- URL: https://github.com/s1m0n38/ai.nvim
- Owner: S1M0N38
- License: mit
- Created: 2024-01-13T14:45:36.000Z (12 months ago)
- Default Branch: main
- Last Pushed: 2024-09-14T18:29:49.000Z (4 months ago)
- Last Synced: 2024-09-16T02:26:03.549Z (3 months ago)
- Topics: ai-nvim, curl, llm, luarocks, neovim, openai-api
- Language: Lua
- Homepage:
- Size: 95.7 KB
- Stars: 2
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- License: LICENSE
README
ai.nvim
______________________________________________________________________
## Idea
LLM providers offer libraries for the most popular programming languages, so you can write code that interacts with their APIs. Generally, these are wrappers around HTTPS requests with a mechanism for handling API responses (e.g., callbacks).

To the best of my knowledge, if you want to build a Neovim plugin that uses an LLM, you have to make the requests explicitly with a tool like `curl` and handle request and response parsing yourself. This results in a lot of boilerplate code that can be abstracted away.
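For illustration, the kind of boilerplate described above might look roughly like the sketch below (not code from this repository; the endpoint, model, and environment variable are placeholders):

```lua
-- Rough sketch of the boilerplate ai.nvim aims to remove: calling an
-- OpenAI-compatible endpoint from Neovim by shelling out to curl.
local function chat_completion(api_key, body, on_done)
  local chunks = {}
  vim.fn.jobstart({
    "curl", "-s", "https://api.openai.com/v1/chat/completions",
    "-H", "Content-Type: application/json",
    "-H", "Authorization: Bearer " .. api_key,
    "-d", vim.json.encode(body),
  }, {
    stdout_buffered = true,
    on_stdout = function(_, data)
      vim.list_extend(chunks, data)
    end,
    on_exit = function(_, code)
      if code == 0 then
        on_done(vim.json.decode(table.concat(chunks, "\n")))
      end
    end,
  })
end

chat_completion(os.getenv("OPENAI_API_KEY"), {
  model = "gpt-4o-mini",
  messages = { { role = "user", content = "Hello!" } },
}, function(response)
  print(response.choices[1].message.content)
end)
```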
`ai.nvim` is an experimental library that can be used to build Neovim plugins that interact with LLM providers: it crafts requests, parses responses, invokes callbacks, and handles errors.
## Requirements
- Neovim ≥ **0.9**
- Curl
- Access to an LLM provider

## Usage
Read the documentation with [`:help ai.nvim`](https://github.com/S1M0N38/ai.nvim/blob/main/doc/ai.txt)
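As a purely hypothetical sketch (the constructor and method names below are invented for illustration and are **not** the actual `ai.nvim` API; the real interface is documented in `:help ai.nvim`), building a plugin on top of such a library could look like:

```lua
-- Hypothetical names for illustration only; see `:help ai.nvim` for the real API.
local ai = require("ai")

local client = ai.Client:new({            -- assumed constructor
  base_url = "https://api.openai.com/v1", -- any OpenAI-compatible provider
  api_key = os.getenv("OPENAI_API_KEY"),
})

client:chat_completion({                  -- assumed method name
  model = "gpt-4o-mini",
  messages = { { role = "user", content = "Summarize this buffer." } },
}, function(response)                     -- callback invoked with the parsed response
  print(response.choices[1].message.content)
end)
```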
Plugins built with `ai.nvim`:
- [dante.nvim](https://github.com/S1M0N38/dante.nvim) – A basic writing tool powered by LLM
- *PR your plugin here ...*

## LLM Providers
There are many providers that offer LLM models through an OpenAI-compatible API.
The following is an incomplete list of providers that I have experimented with:

| Provider | Models | Base URL |
| :--- | :--- | :--- |
| [OpenAI](https://platform.openai.com/docs/overview) | `gpt-4o`, `gpt-4o-mini` | `https://api.openai.com/v1` |
| [Mistral](https://docs.mistral.ai/) | `mistral-large-latest`, `open-mistral-nemo` | `https://api.mistral.ai/v1` |
| [Groq](https://console.groq.com/docs/quickstart) | `gemma2-9b-it`, `llama-3.1-70b-versatile`, `llama-3.1-8b-instant` | `https://api.groq.com/openai/v1` |
| [Copilot Chat](https://docs.github.com/en/copilot/using-github-copilot/asking-github-copilot-questions-in-your-ide)[^1] | `gpt-3.5-turbo`, `gpt-4o-mini`, `gpt-4o`, `gpt-4-0125-preview` | `https://api.githubcopilot.com` |

- If you want to use other providers that do not expose an OpenAI-compatible API (e.g., Anthropic, Cohere, ...), you can try the [liteLLM](https://docs.litellm.ai/docs/) proxy service.
- If you want to use local models, you can use [Ollama](https://ollama.com/), [llama-cpp](https://github.com/ggerganov/llama.cpp), [vLLM](https://docs.vllm.ai/en/latest/), or others.

**There is no future plan to support other API standards besides the OpenAI-compatible API.**
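To illustrate how the table above is meant to be used: switching providers only changes the base URL, API key, and model name. The snippet below mirrors the table (environment variable names and the local Ollama entry are examples, not part of this repository):

```lua
-- Example provider settings; env var names and the Ollama model are illustrative.
local providers = {
  openai  = { base_url = "https://api.openai.com/v1",      api_key = os.getenv("OPENAI_API_KEY"),  model = "gpt-4o-mini" },
  mistral = { base_url = "https://api.mistral.ai/v1",      api_key = os.getenv("MISTRAL_API_KEY"), model = "mistral-large-latest" },
  groq    = { base_url = "https://api.groq.com/openai/v1", api_key = os.getenv("GROQ_API_KEY"),    model = "llama-3.1-8b-instant" },
  -- Local models via Ollama's OpenAI-compatible endpoint (no real key needed).
  ollama  = { base_url = "http://localhost:11434/v1",      api_key = "ollama",                     model = "llama3.1" },
}
```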
## Acknowledgments
- My Awesome Plugin [template](https://github.com/S1M0N38/my-awesome-plugin.nvim).
- [mrcjkb's blog posts](https://mrcjkb.dev/) about Neovim, Luarocks, and Busted.
- The [mrcjkb](https://github.com/mrcjkb) and [vhyrro](https://github.com/vhyrro) repos for their GitHub Actions workflows.

[^1]: Copilot Chat is not a proper LLM provider, but a service offered with a Copilot subscription. If you use [copilot.vim](https://github.com/github/copilot.vim) or [copilot.lua](https://github.com/zbirenbaum/copilot.lua), you should have a token stored in one of these locations: `~/AppData/Local/github-copilot`, `$XDG_CONFIG_HOME/github-copilot`, or `~/.config/github-copilot`, in a file named `hosts.json` or `apps.json`. That token is used to request another token with an expiration time; you can use that second token as the `api_key` in the ai.nvim configuration. There is no plan to implement an automatic token-refresh mechanism.
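For reference, a minimal sketch of locating the stored Copilot token described in the footnote (it assumes each entry in `hosts.json`/`apps.json` contains an `oauth_token` field; exchanging it for the short-lived token used as `api_key` is not shown):

```lua
-- Look for the Copilot oauth token in the locations listed above.
-- Assumption: entries in hosts.json/apps.json expose an `oauth_token` field.
local function copilot_oauth_token()
  local dirs = {
    vim.fn.expand("~/AppData/Local/github-copilot"),
    (os.getenv("XDG_CONFIG_HOME") or vim.fn.expand("~/.config")) .. "/github-copilot",
  }
  for _, dir in ipairs(dirs) do
    for _, name in ipairs({ "hosts.json", "apps.json" }) do
      local path = dir .. "/" .. name
      if vim.fn.filereadable(path) == 1 then
        local data = vim.json.decode(table.concat(vim.fn.readfile(path), "\n"))
        for _, entry in pairs(data) do
          if type(entry) == "table" and entry.oauth_token then
            return entry.oauth_token
          end
        end
      end
    end
  end
end
```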