# local-ai-code-completion README

Enables AI-assisted code completion, similar to GitHub Copilot, completely locally. No code leaves your machine. This has two major benefits:

- **Cost**. This extension is completely free to use.
- **Privacy**. No data is shared with third parties; everything stays on your computer.

## Features

AI-assisted code completion:

- Trigger a completion by pressing `Ctrl+Alt+C`.
- Accept a completion by pressing `Tab`.
- Cancel an ongoing completion by pressing `Escape`.
- Delete a non-accepted completion by pressing `Escape`.

The GIF below is sped up.

![usage example](./assets/lacc-example.gif)
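
If `Ctrl+Alt+C` conflicts with another shortcut, VS Code lets you rebind any command through `keybindings.json`. A minimal sketch; the command ID below is a hypothetical placeholder, so check the extension's Feature Contributions tab in VS Code for the real one:

```jsonc
// keybindings.json (Command Palette: "Preferences: Open Keyboard Shortcuts (JSON)")
[
  {
    "key": "ctrl+alt+space",
    // Hypothetical command ID — look up the actual one in the
    // extension's Feature Contributions tab before using this.
    "command": "local-ai-code-completion.startCompletion",
    "when": "editorTextFocus"
  }
]
```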

The extension uses Code Llama 7B (`codellama`) under the hood, which supports many languages including Python, C++, Java, PHP, TypeScript (JavaScript), C#, and Bash.

According to Meta's evaluation results, Code Llama 7B is almost on par with Codex, the model behind GitHub Copilot.

## Requirements

This extension requires an [Ollama](https://ollama.ai/) installation to run the language model locally. Ollama does not currently support Windows, so this extension is not compatible with Windows either.
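
As a rough setup sketch, assuming the standard Ollama install script on Linux and the plain `codellama` model tag (the exact tag the extension expects may differ):

```sh
# Install Ollama on Linux (on macOS, download the app from https://ollama.ai/)
curl https://ollama.ai/install.sh | sh

# Pull the Code Llama model used for completions
ollama pull codellama

# Sanity check: Ollama serves an HTTP API on port 11434 by default
curl http://localhost:11434/api/tags
```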

## Known Issues

- Time to start generating can be very long. This is inherent to running the model locally on your computer.
- Inference is slow. This is also a consequence of running the model locally, and depends on your system.

## Release Notes

### 1.2.0

#### Added

- Config option for the generation timeout
- Config option for the base URL of the Ollama API, enabling use of the extension with a remote or non-default local Ollama server (see the sketch below)
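
A sketch of how these options might look in VS Code's `settings.json`; the key names and values below are hypothetical illustrations, not the extension's documented settings:

```jsonc
{
  // Hypothetical key: abort generation after this many milliseconds
  "local-ai-code-completion.timeout": 30000,
  // Hypothetical key: point the extension at a remote or non-default Ollama server
  "local-ai-code-completion.baseUrl": "http://localhost:11434"
}
```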

#### Changed

- Improved logging

#### Fixed

- Bug where aborting generation would not work

Thanks to [@johnnyasantoss](https://github.com/johnnyasantoss) for making these changes.

---

### 1.1.0

Added options for changing the model, temperature, and top_p parameters (see the sketch below).
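
As with the 1.2.0 options above, a hypothetical `settings.json` sketch (actual key names may differ):

```jsonc
{
  // Hypothetical keys: model tag, sampling temperature, and nucleus sampling cutoff
  "local-ai-code-completion.model": "codellama:13b",
  "local-ai-code-completion.temperature": 0.2,
  "local-ai-code-completion.top_p": 0.9
}
```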

---

### 1.0.0

Initial release.

---