https://github.com/meeehdi-dev/bropilot.nvim
🤖 Neovim code suggestion and completion (just like GitHub Copilot, but locally using Ollama)
- Host: GitHub
- URL: https://github.com/meeehdi-dev/bropilot.nvim
- Owner: meeehdi-dev
- License: MIT
- Created: 2024-05-05T19:42:58.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2025-03-11T06:26:32.000Z (7 months ago)
- Last Synced: 2025-03-11T07:31:09.754Z (7 months ago)
- Topics: completion, copilot, llm, local, lua, neovim, neovim-lua, neovim-plugin, neovim-plugins, ollama, suggestion
- Language: Lua
- Homepage:
- Size: 232 KB
- Stars: 35
- Watchers: 1
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# Bropilot.nvim
Bropilot is a [GitHub Copilot](https://github.com/github/copilot.vim) alternative that takes advantage of local LLMs through [Ollama](https://ollama.com/)'s API.
Current working models:
- qwen2.5-coder
- deepseek-coder
- deepseek-coder-v2
- starcoder2
- codellama
- ~~codegemma~~ (doesn't seem to work anymore... https://github.com/ollama/ollama/issues/4806)
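Any of these can be pulled from the Ollama model library once Ollama is installed (see Setup below). For example, to fetch the plugin's default model (pick whichever tag and size fits your hardware):

```sh
ollama pull qwen2.5-coder:0.5b-base
```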
## Setup
You need to have [Ollama](https://ollama.com/) installed and running for bro to work.
[Official download link](https://ollama.com/download)

For Linux:
```sh
curl -fsSL https://ollama.com/install.sh | sh
# And check that the service is running
systemctl status ollama
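# Optionally verify the HTTP API answers on Ollama's default port (11434,
# which matches the plugin's default ollama_url); this lists your local models
curl http://localhost:11434/api/tags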
```

## Configuration
Here is the default configuration.
- `auto_suggest` is a boolean that enables automatic debounced suggestions
- `excluded_filetypes` is an array of filetypes ignored by the `auto_suggest` option (https://github.com/meeehdi-dev/bropilot.nvim/pull/1)
- `model` is a string (e.g. "codellama:7b-code" or "codegemma:2b-code")
- `model_params` is an optional table defining model params as per [Ollama API params](https://github.com/ollama/ollama/blob/main/docs/modelfile.md#valid-parameters-and-values)
- `prompt` is a table defining the `prefix`, `suffix` and `middle` keywords for FIM (see the example request after the default configuration block below)
- `debounce` is a number in milliseconds (the delay increases gradually while curl is not responding, to avoid overload issues)
- `keymap` is a table to set the different keymap shortcuts *(not using lazy keys to allow fallback to default behavior when suggestions are not active)*

```lua
require('bropilot').setup({
  auto_suggest = true,
  excluded_filetypes = {},
  model = "qwen2.5-coder:0.5b-base",
  model_params = {
    num_ctx = 32768,
    num_predict = -2,
    temperature = 0.2,
    top_p = 0.95,
    stop = { "<|fim_pad|>", "<|endoftext|>" },
  },
  prompt = {
    prefix = "<|fim_prefix|>",
    suffix = "<|fim_suffix|>",
    middle = "<|fim_middle|>",
  },
  debounce = 500,
  keymap = {
    accept_word = "",
    accept_line = "",
    accept_block = "",
    suggest = "",
  },
  ollama_url = "http://localhost:11434/api",
})
```
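For reference, the `prompt` keywords above implement fill-in-the-middle (FIM): the text before the cursor follows `prefix`, the text after the cursor follows `suffix`, and the model generates whatever belongs at `middle`. As a rough illustration only (not the exact request bropilot builds), an equivalent raw call against Ollama's `/api/generate` endpoint with the default qwen2.5-coder tokens looks like this:

```sh
# Illustration only: ask the base model to fill in the body of a function,
# given the code before the cursor (after <|fim_prefix|>) and the code after
# it (after <|fim_suffix|>). "raw" bypasses Ollama's prompt template.
curl http://localhost:11434/api/generate -d '{
  "model": "qwen2.5-coder:0.5b-base",
  "raw": true,
  "stream": false,
  "options": { "num_predict": 64, "stop": ["<|fim_pad|>", "<|endoftext|>"] },
  "prompt": "<|fim_prefix|>local function add(a, b)\n<|fim_suffix|>\nend\n<|fim_middle|>"
}'
```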
## Usage

Install and configure using [lazy.nvim](https://github.com/folke/lazy.nvim)
```lua
{
  'meeehdi-dev/bropilot.nvim',
  event = "VeryLazy", -- preload model on start
  dependencies = {
    "nvim-lua/plenary.nvim",
    "j-hui/fidget.nvim",
  },
  config = true, -- setup with default options
}
-- or
{
  'meeehdi-dev/bropilot.nvim',
  event = "VeryLazy", -- preload model on start
  dependencies = {
    "nvim-lua/plenary.nvim",
    "j-hui/fidget.nvim",
  },
  opts = {
    auto_suggest = true,
    model = "starcoder2:3b",
    prompt = { -- FIM prompt for starcoder2
      prefix = "<fim_prefix>",
      suffix = "<fim_suffix>",
      middle = "<fim_middle>",
    },
    debounce = 500,
    keymap = {
      accept_line = "",
    },
  },
  config = function (_, opts)
    require("bropilot").setup(opts)
  end,
}
```
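Since the model is preloaded on startup (the `VeryLazy` event in the examples above), you can check from a shell that Ollama actually has it loaded; this is just a sanity check, not something the plugin requires:

```sh
# Shows models currently loaded in memory by Ollama
ollama ps
```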