https://github.com/railtownai/railtracks
An agentic framework that helps developers build resilient agentic systems
- Host: GitHub
- URL: https://github.com/railtownai/railtracks
- Owner: RailtownAI
- License: MIT
- Created: 2025-01-16T19:07:24.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2026-02-12T23:32:36.000Z (2 months ago)
- Last Synced: 2026-02-13T03:56:59.541Z (2 months ago)
- Topics: ai, ai-agents, ai-agents-framework, framework, opensource, python
- Language: Python
- Homepage: http://railtracks.org/
- Size: 13.5 MB
- Stars: 113
- Watchers: 1
- Forks: 5
- Open Issues: 124
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: .github/CODE_OF_CONDUCT.md
- Codeowners: .github/CODEOWNERS
- Security: SECURITY.md
# Railtracks
## What is Railtracks?
Railtracks is a Python framework for building agentic systems. Agent behavior, tools, and multi-step flows are defined entirely in standard Python using the control flow and abstractions you already know.
```python
import railtracks as rt

# Define a tool (just a function!)
def get_weather(location: str) -> str:
    return f"It's sunny in {location}!"

# Create an agent with tools
agent = rt.agent_node(
    "Weather Assistant",
    tool_nodes=[rt.function_node(get_weather)],
    llm=rt.llm.OpenAILLM("gpt-4o"),
    system_message="You help users with weather information.",
)

# Run it
flow = rt.Flow(name="Weather Flow", entry_point=agent)
result = flow.invoke("What's the weather in Paris?")
# or `await flow.ainvoke("What's the weather in Paris?")` in an async context
print(result.text)  # "Based on the current data, it's sunny in Paris!"
```
Execution order, branching, and looping are expressed using standard Python control flow.
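As a minimal sketch of that idea, here is a review loop written as ordinary Python. A plain function stands in for a real agent call (such as `rt.call` or `flow.invoke`), since this snippet makes no LLM requests; the loop, branch, and exit condition are just standard control flow:

```python
# Stub "agent" standing in for a real LLM-backed call.
def review(draft: str) -> str:
    """Flag drafts shorter than 20 characters."""
    return "too short" if len(draft) < 20 else "approved"

def iterate_until_approved(draft: str, max_rounds: int = 3) -> str:
    for _ in range(max_rounds):       # looping: a plain for loop
        verdict = review(draft)
        if verdict == "approved":     # branching: a plain if
            return draft
        draft = draft + " (expanded with more detail)"
    return draft

print(iterate_until_approved("Short note."))
```

In a real flow, `review` would be an agent invocation; the surrounding control flow stays identical.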
## Why Railtracks?
#### Pure Python
```python
# Write tools like regular functions
@rt.function_node
def my_tool(text: str) -> str:
    return process(text)
```
- No YAML, no DSLs, no magic strings
- Compatible with standard debuggers
- Full IDE autocomplete and type checking
#### Tool-First Architecture
```python
# Any function becomes a tool
agent = rt.agent_node(
    "Assistant",
    tool_nodes=[my_tool, api_call],
)
```
- Automatic function-to-tool conversion
- Seamless API and database integration
- MCP protocol support
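To make the "automatic function-to-tool conversion" concrete, here is a sketch of how such a conversion can work in principle, using only the standard library. This is an illustration of the technique, not Railtracks' actual internals; `to_tool_schema` is a hypothetical helper:

```python
# Sketch (not the real Railtracks implementation): derive a tool schema
# from a plain function's signature and docstring.
import inspect
from typing import get_type_hints

def get_weather(location: str) -> str:
    """Return the current weather for a location."""
    return f"It's sunny in {location}!"

def to_tool_schema(fn) -> dict:
    hints = get_type_hints(fn)
    params = {
        name: hints.get(name, str).__name__
        for name in inspect.signature(fn).parameters
    }
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": params,
    }

schema = to_tool_schema(get_weather)
# {'name': 'get_weather',
#  'description': 'Return the current weather for a location.',
#  'parameters': {'location': 'str'}}
```

A framework can hand a schema like this to the LLM as a tool definition, which is why decorating a typed, docstring'd function is all the user needs to write.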
#### Familiar Interface
```python
# Native Async support
result = await rt.call(agent, query)
```
- Standardized `call` interface, consistent with asyncio patterns
- Built-in validation, error handling, and retries
- Automatic parallelization management
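The retry behavior mentioned above can be pictured with a small asyncio sketch. This is illustrative only, showing the pattern conceptually rather than Railtracks' built-in mechanism; `call_with_retries` and `flaky` are hypothetical names:

```python
# Sketch of retry-with-backoff around an async call (illustration, not the
# Railtracks implementation).
import asyncio

async def call_with_retries(fn, *args, attempts: int = 3, base_delay: float = 0.01):
    last_exc = None
    for attempt in range(attempts):
        try:
            return await fn(*args)
        except Exception as exc:  # in practice, catch specific transient errors
            last_exc = exc
            await asyncio.sleep(base_delay * 2 ** attempt)  # exponential backoff
    raise last_exc

async def main():
    calls = {"n": 0}

    async def flaky(query: str) -> str:
        calls["n"] += 1
        if calls["n"] < 3:  # fail twice, then succeed
            raise RuntimeError("transient failure")
        return f"answer to {query}"

    return await call_with_retries(flaky, "What's the weather?")

result = asyncio.run(main())
```

Because the interface is async throughout, independent calls can also be fanned out with `asyncio.gather`, which is the essence of the parallelization the framework manages for you.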
#### Built-in Observability
Railtracks includes a visualizer for inspecting agent runs and evaluations in real time. It runs completely locally, with no signup required.
See the [Observability documentation](https://railtownai.github.io/railtracks/observability/visualization/) for setup and usage.
## Quick Start
#### Installation
```bash
pip install railtracks          # core library
pip install 'railtracks[cli]'   # with the optional CLI extras
```
#### Your First Agent
```python
import railtracks as rt

# 1. Create tools (just functions with decorators!)
@rt.function_node
def count_characters(text: str, character: str) -> int:
    """Count occurrences of a character in text."""
    return text.count(character)

@rt.function_node
def word_count(text: str) -> int:
    """Count words in text."""
    return len(text.split())

# 2. Build an agent with tools
text_analyzer = rt.agent_node(
    "Text Analyzer",
    tool_nodes=(count_characters, word_count),
    llm=rt.llm.OpenAILLM("gpt-4o"),
    system_message="You analyze text using the available tools.",
)

# 3. Use it to solve the classic "How many r's in strawberry?" problem
text_flow = rt.Flow(
    name="Text Analysis Flow",
    entry_point=text_analyzer,
)
result = text_flow.invoke("How many 'r's are in 'strawberry'?")
print(result.text)
```
## LLM Support
Railtracks integrates with major model providers through a unified interface:
```python
# OpenAI
rt.llm.OpenAILLM("gpt-5")
# Anthropic
rt.llm.AnthropicLLM("claude-4-6-sonnet")
# Local models
rt.llm.OllamaLLM("llama3")
```
Works with **OpenAI**, **Anthropic**, **Google**, **Azure**, and more. See the [full provider list](https://railtownai.github.io/railtracks/llm_support/providers/).
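The value of a unified interface is that calling code never changes when the provider does. As a sketch of the pattern (with made-up stand-in classes, not the real Railtracks ones), a structural protocol captures the shared call surface:

```python
# Sketch of a unified provider interface (illustration; the Fake* classes
# are hypothetical stand-ins, not Railtracks classes).
from typing import Protocol

class ChatModel(Protocol):
    model_name: str
    def complete(self, prompt: str) -> str: ...

class FakeOpenAILLM:
    def __init__(self, model_name: str):
        self.model_name = model_name
    def complete(self, prompt: str) -> str:
        return f"[{self.model_name}] {prompt}"

class FakeOllamaLLM:
    def __init__(self, model_name: str):
        self.model_name = model_name
    def complete(self, prompt: str) -> str:
        return f"[local:{self.model_name}] {prompt}"

def ask(llm: ChatModel, prompt: str) -> str:
    # Caller code is identical regardless of provider.
    return llm.complete(prompt)
```

Swapping `FakeOpenAILLM("gpt-4o")` for `FakeOllamaLLM("llama3")` changes nothing in `ask`, which is the same property the `rt.llm.*` classes provide.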
## Contributing
Railtracks is developed in the open. Contributions, bug reports, and feature requests are welcome via [GitHub Issues](https://github.com/RailtownAI/railtracks/issues).
---
Licensed under MIT · Made by the Railtracks team