https://github.com/stippi/code-assistant
An LLM-powered, autonomous coding assistant. Also offers an MCP mode.
- Host: GitHub
- URL: https://github.com/stippi/code-assistant
- Owner: stippi
- License: gpl-3.0
- Created: 2024-11-03T17:21:41.000Z (6 months ago)
- Default Branch: main
- Last Pushed: 2025-03-17T16:24:43.000Z (about 1 month ago)
- Last Synced: 2025-03-17T16:37:11.698Z (about 1 month ago)
- Topics: agentic-ai, assistant, claude-3-5-sonnet, claude-3-7-sonnet, mcp-server
- Language: Rust
- Homepage:
- Size: 785 KB
- Stars: 1
- Watchers: 1
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- Awesome-MCP-Servers-directory - code-assistant - A coding assistant MCP server that allows exploring a code-base and making changes to code. Should be used with trusted repos only (insufficient protection against prompt injections) (Developer Tools)
- awesome-mcp-servers - CodeAssist - An LLM-powered, autonomous coding assistant. Also offers an MCP mode. (Table of Contents / AI Services)
README
# Code Assistant
[![Build](https://github.com/stippi/code-assistant/actions/workflows/build.yml/badge.svg)](https://github.com/stippi/code-assistant/actions/workflows/build.yml)
A CLI tool built in Rust for assisting with code-related tasks.
## Features
- **Autonomous Exploration**: The agent can intelligently explore codebases and build up working memory of the project structure.
- **Reading/Writing Files**: The agent can read file contents and make changes to files as needed.
- **Working Memory Management**: Efficient handling of file contents with the ability to load and unload files from memory.
- **File Summarization**: Capability to create and store file summaries for quick reference and better understanding of the codebase.
- **Interactive Communication**: Ability to ask users questions and get responses for better decision-making.
- **MCP Server Mode**: Can run as a Model Context Protocol server, providing tools and resources to LLMs running in an MCP client.

## Installation
Ensure you have [Rust installed](https://www.rust-lang.org/tools/install) on your system. Then:
```bash
# Clone the repository
git clone https://github.com/stippi/code-assistant

# Navigate to the project directory
cd code-assistant

# Build the project
cargo build --release

# The binary will be available in target/release/code-assistant
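# Alternatively (a sketch, not from this README; assumes a standard cargo
# setup), cargo can build and install the binary onto your PATH in one step:
# cargo install --path .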
```

## Configuration in Claude Desktop
The `code-assistant` implements the [Model Context Protocol](https://modelcontextprotocol.io/introduction) by Anthropic.
This means it can be added as a plugin to MCP client applications such as **Claude Desktop**.

### Configure Your Projects
Create a file `.code-assistant/projects.json` in your home directory.
This file declares the projects available in MCP server mode (via the `list_projects` and `open_project` tools).
It has the following structure:

```json
{
  "code-assistant": {
    "path": "/Users//workspace/code-assistant"
  },
  "asteroids": {
    "path": "/Users//workspace/asteroids"
  },
  "zed": {
    "path": "/Users//workspace/zed"
  }
}
```

Notes:
- The absolute paths are never revealed to the LLM by the tool, to avoid leaking such information to LLM cloud providers.
- This file can be edited without restarting Claude Desktop or the MCP server.

### Configure MCP Servers
- Open the Claude Desktop application settings (**Claude** -> Settings)
- Switch to the **Developer** tab.
- Click the **Edit Config** button.

A Finder window opens, highlighting the file `claude_desktop_config.json`.
Open that file in your favorite text editor. An example configuration is given below:
```json
{
  "mcpServers": {
    "code-assistant": {
      "command": "/Users//workspace/code-assistant/target/release/code-assistant",
      "args": [
        "server"
      ]
    }
  }
}
```

## Usage
Code Assistant can run in two modes:
### Agent Mode (Default)
```bash
code-assistant --task <TASK> [OPTIONS]
```

Available options:
- `--path <PATH>`: Path to the code directory to analyze (default: current directory)
- `-t, --task <TASK>`: Task to perform on the codebase (required unless `--continue-task` or `--ui` is used)
- `--ui`: Start with GUI interface
- `--continue-task`: Continue from previous state
- `-v, --verbose`: Enable verbose logging
- `-p, --provider <PROVIDER>`: LLM provider to use [anthropic, open-ai, ollama, vertex] (default: anthropic)
- `-m, --model <MODEL>`: Model name to use (defaults: anthropic="claude-3-7-sonnet-20250219", openai="gpt-4o", vertex="gemini-1.5-pro-latest")
- `--base-url <URL>`: API base URL for the LLM provider
- `--tools-type <TYPE>`: Type of tool declaration [native, xml] (default: xml); `native` = tools via LLM provider API, `xml` = custom system message
- `--num-ctx <NUM>`: Context window size in tokens (default: 8192; only relevant for Ollama)
- `--record <PATH>`: Record API responses to a file for testing (currently Anthropic only)
- `--playback <PATH>`: Play back a recorded session from a file

Environment variables:
- `ANTHROPIC_API_KEY`: Required when using the Anthropic provider
- `OPENAI_API_KEY`: Required when using the OpenAI provider
- `GOOGLE_API_KEY`: Required when using the Vertex provider

Examples:
```bash
# Analyze code in current directory using Anthropic's Claude
code-assistant --task "Explain the purpose of this codebase"

# Use OpenAI to analyze a specific directory with verbose logging
code-assistant -p open-ai --path ./my-project -t "List all API endpoints" -v

# Use Google's Vertex AI with a specific model
code-assistant -p vertex --model gemini-1.5-flash -t "Analyze code complexity"

# Continue a previously interrupted task
code-assistant --continue-task

# Start with GUI interface
code-assistant --ui
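
# Use a local model via Ollama with a larger context window (a sketch built
# from the options above; "llama3" is a placeholder model name)
code-assistant -p ollama -m llama3 --num-ctx 16384 -t "Summarize the project structure"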
```

### Server Mode
Runs as a Model Context Protocol server:
```bash
code-assistant server [OPTIONS]
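
# For example, with verbose logging (the only documented option, see below):
code-assistant server -v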
```

Available options:
- `-v, --verbose`: Enable verbose logging

## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.