# twinny

Twinny is a free and private AI extension for Visual Studio Code: a no-nonsense, locally or API-hosted code completion plugin, like GitHub Copilot but completely free and 100% private. It works with the following providers:

- [Ollama](https://github.com/jmorganca/ollama)
- [llama.cpp](https://github.com/ggerganov/llama.cpp)
- [oobabooga/text-generation-webui](https://github.com/oobabooga/text-generation-webui)
- [LM Studio](https://github.com/lmstudio-ai)
- [LiteLLM](https://github.com/BerriAI/litellm)
- [Open WebUI](https://github.com/open-webui/open-webui)

## 🚀 Getting Started

Visit the [quick start guide](https://rjmacarthy.github.io/twinny-docs/general/quick-start/) to get started.

## Main Features

### Fill in the Middle Code Completion

Get AI-based suggestions in real time. Let Twinny autocomplete your code as you type.

![Fill in the Middle Example](https://github.com/rjmacarthy/twinny/assets/5537428/69f567c0-2700-4474-b621-6099255bc87b)
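
To make the idea concrete, here is a minimal sketch of the kind of fill-in-the-middle request a completion could trigger against a local Ollama server. The endpoint, model name, and the CodeLlama `<PRE>`/`<SUF>`/`<MID>` infill template are assumptions for a typical local setup, not twinny's exact internals:

```typescript
// Hypothetical sketch of a fill-in-the-middle (FIM) completion request.
// Assumes a local Ollama server and a CodeLlama code model that uses the
// <PRE>/<SUF>/<MID> infill template; not twinny's actual implementation.
const prefix = "function add(a: number, b: number) {\n  return ";
const suffix = ";\n}";

const res = await fetch("http://localhost:11434/api/generate", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "codellama:7b-code",                      // assumed model name
    prompt: `<PRE> ${prefix} <SUF>${suffix} <MID>`,  // infill prompt
    raw: true,             // send the prompt as-is, no chat templating
    stream: false,
    options: { temperature: 0.2, num_predict: 64 },
  }),
});

const { response: completion } = await res.json();
console.log(completion); // e.g. "a + b" — inserted between prefix and suffix
```

The editor then splices the returned text between the code before and after the cursor, which is what makes FIM feel like inline autocomplete rather than plain left-to-right generation.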

### Chat with AI About Your Code

Discuss your code via the sidebar: get function explanations, generate tests, request refactoring, and more.

### Additional Features

- Operates online or offline
- Highly customizable API endpoints for FIM and chat
- Chat conversations are preserved
- Conforms to the OpenAI API standard (see the sketch after this list)
- Supports single or multiline fill-in-middle completions
- Customizable prompt templates
- Generate git commit messages from staged changes
- Easy installation via the Visual Studio Code extensions marketplace
- Customizable settings for API provider, model name, port number, and path
- Compatible with Ollama, llama.cpp, oobabooga, and LM Studio APIs
- Accepts code solutions directly in the editor
- Creates new documents from code blocks
- Copies generated code solution blocks
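
Because the chat side conforms to the OpenAI API standard, any OpenAI-compatible endpoint can serve it. Here is a minimal sketch, assuming Ollama's OpenAI-compatible endpoint on its default port and an assumed instruct model; the details are illustrative, not twinny's exact configuration:

```typescript
// Hypothetical sketch of an OpenAI-compatible chat request, the kind of
// call a "chat about your code" feature can make. Endpoint and model name
// are assumptions for a local Ollama setup.
const res = await fetch("http://localhost:11434/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "codellama:7b-instruct", // assumed model name
    messages: [
      { role: "system", content: "You are a helpful coding assistant." },
      {
        role: "user",
        content: "Explain this function:\n\nfunction add(a, b) { return a + b; }",
      },
    ],
  }),
});

const data = await res.json();
console.log(data.choices[0].message.content); // the model's explanation
```

Sticking to this standard is what lets the extension swap freely between Ollama, llama.cpp, LiteLLM, and other backends by changing only the hostname, port, and path in its settings.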

## Known Issues

Visit the GitHub [issues page](https://github.com/rjmacarthy/twinny/issues) for known problems and troubleshooting.

## Contributing

Interested in contributing? Reach out on [Twitter](https://x.com/rjmacarthy), describe your changes in an issue, and submit a PR when ready. Twinny is open-source under the MIT license. See the [LICENSE](https://github.com/rjmacarthy/twinny/blob/master/LICENSE) for more details.

## Support Twinny

Thanks for using Twinny!

This project is and will always be free and open source. If you find it helpful, please consider showing your appreciation with a small donation <3

Bitcoin: `1PVavNkMmBmUz8nRYdnVXiTgXrAyaxfehj`

## Disclaimer

Twinny is actively developed and provided "as is". Functionality may vary between updates.

## Star History

Star History Chart