Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/rjmacarthy/twinny
The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
artificial-intelligence code-chat code-completion code-generation codellama copilot free llama2 llamacpp ollama ollama-api ollama-chat private symmetry vscode-extension
Last synced: 4 days ago
The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
- Host: GitHub
- URL: https://github.com/rjmacarthy/twinny
- Owner: twinnydotdev
- License: mit
- Created: 2023-08-21T14:10:35.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-10-18T12:09:18.000Z (30 days ago)
- Last Synced: 2024-10-29T14:51:14.540Z (19 days ago)
- Topics: artificial-intelligence, code-chat, code-completion, code-generation, codellama, copilot, free, llama2, llamacpp, ollama, ollama-api, ollama-chat, private, symmetry, vscode-extension
- Language: TypeScript
- Homepage: https://twinny.dev
- Size: 59.1 MB
- Stars: 3,019
- Watchers: 18
- Forks: 162
- Open Issues: 48
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- Funding: .github/FUNDING.yml
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
Awesome Lists containing this project
- awesome-ai-coding - Twinny
README
# twinny
Free and private AI extension for Visual Studio Code.

Supported providers:

- [Ollama](https://github.com/jmorganca/ollama)
- [llama.cpp](https://github.com/ggerganov/llama.cpp)
- [oobabooga/text-generation-webui](https://github.com/oobabooga/text-generation-webui)
- [LM Studio](https://github.com/lmstudio-ai)
- [LiteLLM](https://github.com/BerriAI/litellm)
- [Open WebUI](https://github.com/open-webui/open-webui)

## 🚀 Getting Started
Visit the [quick start guide](https://twinnydotdev.github.io/twinny-docs/general/quick-start/) to get started.

## Main Features
### Fill in the Middle Code Completion
Get AI-based suggestions in real time. Let Twinny autocomplete your code as you type.
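To give a feel for what happens behind an FIM completion, here is a minimal sketch of a fill-in-the-middle request to a locally hosted CodeLlama-style model. The Ollama endpoint, model tag, and infill template below are illustrative assumptions, not twinny's actual implementation.

```typescript
// Hypothetical sketch: ask a local Ollama server to fill in the code between
// a prefix and a suffix using a CodeLlama-style infill prompt.
// The endpoint, model tag, and template are assumptions for illustration.
const prefix = "function add(a: number, b: number) {\n  return ";
const suffix = ";\n}";

async function fimComplete(): Promise<string> {
  const response = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "codellama:7b-code",
      // CodeLlama infill format: the model generates the text that belongs
      // between the <PRE> (prefix) and <SUF> (suffix) sections.
      prompt: `<PRE> ${prefix} <SUF>${suffix} <MID>`,
      raw: true, // send the prompt as-is, without Ollama's chat template
      stream: false,
    }),
  });
  const data = await response.json();
  return data.response; // e.g. "a + b"
}

fimComplete().then(console.log);
```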
![Fill in the Middle Example](https://github.com/rjmacarthy/twinny/assets/5537428/69f567c0-2700-4474-b621-6099255bc87b)

### Chat with AI About Your Code
Discuss your code via the sidebar: get function explanations, generate tests, request refactoring, and more.

### Additional Features
- Operates online or offline
- Highly customizable API endpoints for FIM and chat
- Chat conversations are preserved
- Conforms to the OpenAI API standard (see the request sketch after this list)
- Supports single or multiline fill-in-middle completions
- Customizable prompt templates
- Generate git commit messages from staged changes
- Easy installation via the Visual Studio Code extensions marketplace
- Customizable settings for API provider, model name, port number, and path
- Compatible with Ollama, llama.cpp, oobabooga, and LM Studio APIs
- Accepts code solutions directly in the editor
- Creates new documents from code blocks
- View side by side diff of code blocks
- Open chat in full screen mode
- Copies generated code solution blocks
- Workspace embeddings for context-aware AI assistance
- Connect to the Symmetry network for P2P AI inference
- Become a provider on the Symmetry network and share your computational resources with the world
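Because the extension follows the OpenAI API standard, the chat side of any compatible backend can be exercised with a plain HTTP request. The sketch below assumes a local Ollama server exposing its OpenAI-compatible endpoint; the URL and model name are examples rather than required values, and this is not twinny's own code.

```typescript
// Minimal sketch of an OpenAI-style chat completion request against a local
// OpenAI-compatible server (assumed here to be Ollama on its default port).
async function chat(prompt: string): Promise<string> {
  const response = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "codellama:7b-instruct", // any chat model served by the backend
      messages: [
        { role: "system", content: "You are a helpful coding assistant." },
        { role: "user", content: prompt },
      ],
      stream: false,
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content;
}

chat("Explain this function and suggest a unit test for it.").then(console.log);
```

Pointing the extension's configurable provider, model name, port, and path settings at a different OpenAI-compatible server follows the same shape.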
### Workspace Embeddings
Enhance your coding experience with context-aware AI assistance using workspace embeddings.
- **Embed Your Workspace**: Easily embed your entire workspace with a single click.
- **Context-Aware Responses**: twinny uses relevant parts of your codebase to provide more accurate and contextual answers.
- **Customizable Embedding Provider**: By default, uses Ollama Embedding (all-minilm:latest), but supports various providers (see the sketch after this list).
- **Adjustable Relevance**: Fine-tune the rerank probability threshold to control the inclusion of context in AI responses.
- **Toggle Embedded Context**: Easily switch between using embedded context or not for each message.
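For a sense of what the default embedding provider does, here is a sketch of a request to Ollama's embeddings API with the all-minilm model mentioned above. It is illustrative only and not taken from twinny's source.

```typescript
// Sketch: fetch an embedding vector for a snippet from a local Ollama server
// using the all-minilm model. The resulting vector is the kind of value used
// for similarity search over an embedded workspace; this is not twinny's code.
async function embed(text: string): Promise<number[]> {
  const response = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "all-minilm:latest", prompt: text }),
  });
  const data = await response.json();
  return data.embedding;
}

embed("function add(a, b) { return a + b; }").then((v) => console.log(v.length));
```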
### Symmetry network

[Symmetry](https://twinny.dev/symmetry) is a decentralized peer-to-peer network tool designed to democratize access to computational resources for AI inference. Key features include:

- Resource Sharing: Users can offer or seek computational power for various AI tasks.
- Direct Connections: Enables secure, peer-to-peer connections between users.
- Visual Studio Code Integration: Twinny has built-in functionality to connect as a peer or provider directly within VS Code.
- Public Provider Access: Users can leverage models from other users who are public providers on the Symmetry network.

Symmetry aims to make AI inference more accessible and efficient for developers and researchers.
The client source code is open source and can be found [here](https://github.com/twinnydotdev/symmetry-core).
## Known Issues
Visit the GitHub [issues page](https://github.com/rjmacarthy/twinny/issues) for known problems and troubleshooting.

## Contributing
Interested in contributing? Reach out on [Twitter](https://x.com/twinnydotdev), describe your changes in an issue, and submit a PR when ready. Twinny is open-source under the MIT license. See the [LICENSE](https://github.com/rjmacarthy/twinny/blob/master/LICENSE) for more details.

## Support Twinny
Thanks for using Twinny!
This project is and will always be free and open source. If you find it helpful, please consider showing your appreciation with a small donation <3
Bitcoin: `1PVavNkMmBmUz8nRYdnVXiTgXrAyaxfehj`

Follow my journey on X for updates! https://x.com/rjmacarthy
## Disclaimer
Twinny is actively developed and provided "as is". Functionality may vary between updates.