https://github.com/Storia-AI/sage
Chat with any codebase in under two minutes | Fully local or via third-party APIs
- Host: GitHub
- URL: https://github.com/Storia-AI/sage
- Owner: Storia-AI
- License: apache-2.0
- Created: 2024-08-23T23:14:55.000Z (8 months ago)
- Default Branch: main
- Last Pushed: 2024-11-11T04:49:34.000Z (5 months ago)
- Last Synced: 2025-01-05T05:00:36.517Z (4 months ago)
- Topics: ai, anthropic, claude, copilot, developer-tools, hacktoberfest, hacktoberfest-2024, hacktoberfest2024, langchain, llm, openai, pinecone, python, rag
- Language: Python
- Homepage: https://sage.storia.ai
- Size: 9.6 MB
- Stars: 1,135
- Watchers: 9
- Forks: 97
- Open Issues: 27
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
Awesome Lists containing this project
- StarryDivineSky - Storia-AI/sage - Storia-AI/sage is a tool that lets you chat with any codebase in under two minutes. It supports running fully locally or via third-party APIs. Sage's core function is to help you quickly understand and query a codebase without digging into the code's details. It analyzes the structure and content of the codebase to build an interactive knowledge graph, then uses natural-language processing to answer your questions. You can ask in plain language, for example "What does this function do?" or "How is this class implemented?". Sage supports multiple programming languages and can be integrated into your existing development workflow. It aims to boost development efficiency by helping you understand and use codebases faster. Whether you are new to a team or need to get up to speed on a project quickly, Sage can be a great help. (A01_Text Generation_Text Dialogue / Large language dialogue models and data)
README
# Sage: Chat with any codebase
*Our chat window, showing a conversation with the Transformers library.* 🚀
***
**Sage** is like an open-source GitHub Copilot that helps you learn how a codebase works and how to integrate it into your project without spending hours sifting through the code.
# Main features
- **Dead-simple setup**. Follow our [quickstart guide](https://sage-docs.storia.ai/quickstart) to get started.
- **Runs locally or on the cloud**. When privacy is your priority, you can run the entire pipeline locally using [Ollama](https://ollama.com) for LLMs and [Marqo](https://github.com/marqo-ai/marqo) as a vector store. When optimizing for quality, you can use third-party LLM providers like OpenAI and Anthropic.
- **Wide range of built-in retrieval mechanisms**. We support both lightweight retrieval strategies (requiring nothing more than an LLM API key) and more traditional RAG (which requires indexing the codebase). There are many knobs you can tune to make retrieval work well on your codebase.
- **Well-documented experiments**. We profile various strategies (for embeddings, retrieval, etc.) on our own benchmark and thoroughly [document the results](benchmarks/retrieval/README.md).

# Want your repository hosted?
We're working to make all code on the internet searchable and understandable for devs. Check out our [hosted app](https://sage.storia.ai). We've pre-indexed a slew of OSS repos, and you can index your desired ones by simply pasting a GitHub URL.
If you're the maintainer of an OSS repo and would like a dedicated page on Code Sage (e.g. `sage.storia.ai/your-repo`), then send us a message at [[email protected]](mailto:[email protected]). We'll do it for free!

# Extensions & Contributions
We purposefully built the code to be modular, so you can plug in your desired embedding, LLM, and vector store providers by simply implementing the relevant abstract classes.
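As a rough illustration of that plug-in pattern, here is a minimal sketch of what implementing a custom embeddings provider could look like. Note that the class and method names below (`Embedder`, `embed`) are hypothetical stand-ins, not sage's actual API; consult the repo's abstract base classes for the real interfaces.

```python
from abc import ABC, abstractmethod
from typing import List


class Embedder(ABC):
    """Hypothetical interface a custom embeddings provider would implement."""

    @abstractmethod
    def embed(self, chunks: List[str]) -> List[List[float]]:
        """Map each text chunk to a fixed-size embedding vector."""
        ...


class BagOfCharsEmbedder(Embedder):
    """Toy provider: embeds each chunk as normalized a-z character counts."""

    def embed(self, chunks: List[str]) -> List[List[float]]:
        vectors = []
        for chunk in chunks:
            counts = [0.0] * 26
            for ch in chunk.lower():
                if "a" <= ch <= "z":
                    counts[ord(ch) - ord("a")] += 1.0
            total = sum(counts) or 1.0  # avoid division by zero on empty chunks
            vectors.append([c / total for c in counts])
        return vectors


embedder = BagOfCharsEmbedder()
vecs = embedder.embed(["def foo(): pass"])
print(len(vecs), len(vecs[0]))  # 1 26
```

A real provider would call out to a model (local or hosted) inside `embed`, but the shape is the same: subclass the abstract interface, implement its methods, and the rest of the pipeline stays untouched.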
Feel free to send feature requests to [[email protected]](mailto:[email protected]) or make a pull request!
# Contributors