Ecosyste.ms: Awesome

An open API service indexing awesome lists of open source software.


https://github.com/kristofferv98/agent_nexus

Agentic framework for dynamic function calling across the latest LLMs (gpt-4o, gemini-2.0-flash, Groq models, and Anthropic models). Converts Python functions into provider-specific schemas for autonomous tool use. Features a unified API, JSON schema generation, and integrated tool execution handling.
agent-orchestration agents anthropic function-calling gemini gemini-2-0-flash-exp gemini-tools groq json-schema llm-inference multi-llm openai parallel-processing schema-generation tool-generator tool-integration tools

# Multi-LLM Tool Integration Codebase

Welcome to this multi-LLM tool integration codebase! This repository provides a powerful system for integrating Python functions with various LLM providers like OpenAI, Anthropic, Gemini, and Groq.

## Key Features

- **Dynamic Schema Generation**: Automatically converts Python functions into JSON schemas compatible with major LLM providers
- **Multi-LLM Support**: Seamlessly connects with OpenAI, Anthropic, Gemini, and Groq through a unified interface
- **Transparent Processing**: Handles all message flows, tool calls, and responses dynamically; simply pass in the functions you want to use

The system allows you to easily create tools from Python functions and use them with the supported LLM providers. It includes features like parallel tool execution, conversation logging, and a simple interface for registering new functions.
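To make the schema-generation idea concrete, here is a minimal, hypothetical sketch of deriving an OpenAI-style function schema from a Python signature and docstring. Note that the ToolConverter in this repo delegates schema generation to gpt-4o, so treat this purely as an illustration of the target shape, not the actual implementation:

```python
import inspect
from typing import Callable, get_type_hints

# Illustrative mapping from Python annotations to JSON Schema type names.
PYTHON_TO_JSON = {int: "integer", float: "number", str: "string", bool: "boolean"}

def sketch_schema(func: Callable) -> dict:
    """Builds a rough OpenAI-style function schema from a signature."""
    hints = get_type_hints(func)
    params = inspect.signature(func).parameters
    properties = {
        name: {"type": PYTHON_TO_JSON.get(hints.get(name, str), "string")}
        for name in params
    }
    return {
        "name": func.__name__,
        "description": (func.__doc__ or "").strip(),
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": list(params),
        },
    }

def add_numbers(a: int, b: int) -> int:
    """Adds two numbers."""
    return a + b

print(sketch_schema(add_numbers)["parameters"]["properties"])
# {'a': {'type': 'integer'}, 'b': {'type': 'integer'}}
```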

##### Example of parallel and chained tool calls in the main script (with a Christmas theme 🎄):
![Parallel and Chained Tool Calls](images/conversation_parallel_tool_calls.png)
*Example showing parallel and chained tool calls working together*

## Getting Started

This documentation will guide you through:
- Understanding the system architecture
- Setting up your environment
- Implementing your own tools
- Integrating with different LLM providers

## Table of Contents

1. [Overview of Key Components](#overview-of-key-components)
2. [Directory Structure](#directory-structure)
3. [Installation and Setup](#installation-and-setup)
4. [Usage Guide](#usage-guide)
- [Generating JSON Schemas with ToolConverter](#generating-json-schemas-with-toolconverter)
- [Registering Functions with LLMHandler](#registering-functions-with-llmhandler)
- [Running the Main Script](#running-the-main-script)
5. [Adding Your Own Tools / Functions](#adding-your-own-tools--functions)
6. [Details of Each Module](#details-of-each-module)
7. [Example Snippets](#example-snippets)
8. [Contributing](#contributing)
9. [License](#license)

---

## Overview of Key Components

This project's main functionality resides in bridging Python-based "tools" (i.e., functions) with various LLM APIs. To accomplish this, we have:

- **ToolConverter**: A utility class that dynamically converts Python functions into JSON schemas for OpenAI-style function calls, then adapts those schemas to Anthropic, Gemini, and Groq formats.
- **LLMHandler**: A high-level orchestrator that sends messages to an LLM, handles tool calls, and returns the final text response to the user. It integrates with a chosen LLM client (OpenAI, Anthropic, Gemini, Groq) through a standard interface (the base API).
- **LLM API Wrappers**: Each LLM vendor (OpenAI, Anthropic, Gemini, Groq) is wrapped in a dedicated module, conforming to a unified interface (defined in base_api.py).
- **MessageHandler**: Manages message formatting, storing system/user/tool messages in a standardized structure.

With these components, you can easily add your own Python functions (tools) that an LLM may call to perform tasks such as math calculations, file I/O, or anything else your application needs.

---

## Directory Structure

Here's a simplified structure of the repository:

```
.
├── images/
│   ├── parallel_tool_calls.png     # Example of conversation
│   └── simple_conversation.png     # Example of conversation
├── functions/
│   └── math_tools.py               # Example math functions (tools)
├── llm_api/
│   ├── base_api.py                 # Abstract base interface for LLM API classes
│   ├── openai_api.py               # Wrapper for OpenAI
│   ├── anthropic_api.py            # Wrapper for Anthropic
│   ├── gemini_api.py               # Wrapper for Gemini
│   └── groq_api.py                 # Wrapper for Groq
├── llm_tools/
│   ├── llm_handler.py              # Main orchestrator class for LLM usage
│   ├── message_handler.py          # Handles message creation and formatting
│   └── conversation_printers.py    # Handles printing the conversation
├── tool_converter.py               # ToolConverter: generates JSON schemas
├── main_test.py                    # Tests all LLM providers with parallel and chained tools
└── main.py                         # Example driver script that ties everything together
```

---

## Installation and Setup

1. Clone this repository:
```bash
git clone https://github.com/kristofferv98/Agent_Nexus.git
cd Agent_Nexus
```

2. Create and activate a virtual environment (optional, but recommended):
```bash
python3 -m venv venv
source venv/bin/activate
```

3. Install dependencies (example using pip):
```bash
pip install -r requirements.txt
```

4. Set the desired environment variables for your LLM provider credentials (e.g., OPENAI_API_KEY, ANTHROPIC_API_KEY, GEMINI_API_KEY, GROQ_API_KEY):
```bash
export OPENAI_API_KEY="your-api-key"
export ANTHROPIC_API_KEY="your-api-key"
export GEMINI_API_KEY="your-api-key"
export GROQ_API_KEY="your-api-key"
```

---

## Usage Guide

### Generating JSON Schemas with ToolConverter

The ToolConverter takes a list of Python functions and generates schemas that the LLM can use to understand arguments, parameter validation, and usage. (It uses OpenAI's gpt-4o to generate schemas for all functions in parallel.)

You'll find an example in:
```python
class ToolConverter:
    ...
    def generate_schemas(self, functions: List[Callable]) -> dict:
        ...
```

When you provide a list of functions (e.g., [subtract_numbers, add_numbers, multiply_numbers]), it returns a dictionary containing OpenAI, Anthropic, Gemini, and Groq schemas for those functions.
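As a rough illustration, the returned dictionary maps provider names to lists of tool schemas. The entry below uses the standard OpenAI tool format; the exact structure produced by `generate_schemas` may differ, so treat the field names as assumptions:

```python
# Illustrative only: an OpenAI-format entry for a single add_numbers tool.
# The other keys would hold the same tool translated into each provider's
# native schema format.
schemas = {
    "openai": [
        {
            "type": "function",
            "function": {
                "name": "add_numbers",
                "description": "Adds two numbers.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "a": {"type": "number"},
                        "b": {"type": "number"},
                    },
                    "required": ["a", "b"],
                },
            },
        }
    ],
    "anthropic": [],  # Anthropic variants (with "input_schema") go here
    "gemini": [],     # Gemini function-declaration variants go here
    "groq": [],       # Groq follows the OpenAI format closely
}
```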

### Registering Functions with LLMHandler

The LLMHandler manages the conversation loop. Whenever the LLM attempts to call a tool, LLMHandler intercepts the request and calls the actual Python code. This behavior is unified across all LLM providers.

For example, in:
```python
class LLMHandler:
    ...
    def register_functions(self, functions: List[Callable]):
        for func in functions:
            self.register_function(func)
    ...
```

By calling `register_functions([your_func_1, your_func_2])`, you make those tools available to any LLM you choose to integrate.
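Under the hood, registration amounts to keeping a name-to-callable map so tool calls coming back from the LLM can be dispatched to real code. A hypothetical minimal sketch (not the repo's actual implementation):

```python
from typing import Callable, Dict, List

class ToolRegistry:
    """Toy stand-in for the registration side of LLMHandler."""

    def __init__(self) -> None:
        self._tools: Dict[str, Callable] = {}

    def register_function(self, func: Callable) -> None:
        # Store the callable under its own name, which is also the
        # name the LLM uses in its tool-call requests.
        self._tools[func.__name__] = func

    def register_functions(self, functions: List[Callable]) -> None:
        for func in functions:
            self.register_function(func)

    def dispatch(self, name: str, arguments: dict):
        # Called when the LLM emits a tool call with a name and JSON arguments.
        return self._tools[name](**arguments)

def add_numbers(a: float, b: float) -> float:
    """Adds two numbers."""
    return a + b

registry = ToolRegistry()
registry.register_functions([add_numbers])
print(registry.dispatch("add_numbers", {"a": 2, "b": 3}))  # 5
```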

### Running the Main Script

An example of usage is in `main.py`. It:

1. Initializes ToolConverter.
2. Gathers your tools (e.g., travel time functions).
3. Generates the schemas for all LLM flavors.
4. Instantiates an LLM client (e.g., OpenAI) and an LLMHandler.
5. Registers the tools and sets the system prompt.
6. Sends user messages to the LLM, which can call the newly registered tools as needed.
7. Returns a final answer after tool execution.

You can run:
```bash
python main.py
```

### Running the Main Test Script (testing all LLM providers)
You can run:
```bash
python main_test.py
```
Adjust the flags in the script (run_openai, run_anthropic, run_groq, run_gemini) to choose which LLM(s) to test.

---

## Adding Your Own Tools / Functions

1. Import or define new Python functions.
2. Give them docstrings and type annotations (if possible) for clarity and better schema generation.
3. Pass them to the `ToolConverter.generate_schemas()` method.
4. Register them with your LLMHandler instance.

For example, if you have:
```python
def greet_user(name: str) -> str:
    """Greets a user by name."""
    return f"Hello, {name}!"
```
Add it to the code flow in `main.py` (or another script), similarly to how the math_tools are used.
For complex logic, wrap it in simple, well-typed functions so schema generation stays straightforward.
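For instance, a complicated internal API can be hidden behind a small, well-typed wrapper. This is a hypothetical example; `_complex_search` stands in for whatever internal machinery your application has:

```python
def _complex_search(query, filters, ranking_profile, pagination):
    """Stand-in for a complicated internal API with many knobs."""
    return [f"Top result for {query!r}"]

def search_documents(query: str) -> str:
    """Searches indexed documents and returns the top match."""
    # The LLM only ever sees this simple one-argument signature;
    # the complex defaults are decided here, not by the model.
    results = _complex_search(
        query,
        filters=None,
        ranking_profile="default",
        pagination={"page": 1, "size": 1},
    )
    return results[0]

print(search_documents("tool calling"))
```

Exposing only `search_documents` keeps the generated schema small and unambiguous, which generally improves how reliably the model calls the tool.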

---

## Details of Each Module

1. **tool_converter.py**
- Responsible for converting Python functions into JSON schemas.
- The key method is `generate_schemas()`, which returns a dictionary containing schemas for multiple LLM providers.

2. **llm_api/base_api.py**
- Abstract base class defining the uniform `generate(messages, tools)` method.
- All LLM API wrappers must subclass BaseLLMAPI.

3. **llm_api/openai_api.py, anthropic_api.py, gemini_api.py, groq_api.py**
- Concrete wrappers implementing each provider's unique request/response pattern.

4. **llm_tools/message_handler.py**
- Normalizes messages into a standard format.
- Manages user text blocks, system prompts, tool calls, and image data.

5. **llm_tools/llm_handler.py**
- The core orchestrator that calls the LLM, detects `tool_use` instructions, and executes the corresponding Python functions.
- Consolidates final text responses.

6. **functions/math_tools.py**
- Sample set of math functions (tools) that demonstrate how to integrate your logic into the system.

7. **main.py**
- A reference script showing how everything ties together.
- Instantiates the ToolConverter, creates schemas, picks an LLM, registers tools, sets the prompt, and interacts with the user.
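Conceptually, the orchestration loop in `llm_handler.py` works like the following sketch. The message and response shapes here are invented for illustration (the repo's real handler is provider-aware and more involved), and `fake_llm` is a scripted stand-in for a real client:

```python
import json

def run_conversation(llm_generate, tools, functions, messages):
    """Call the model, execute requested tools, loop until plain text."""
    registry = {f.__name__: f for f in functions}
    while True:
        response = llm_generate(messages, tools)
        if response["type"] != "tool_use":
            return response["text"]  # final answer
        # Execute the requested tool and feed the result back to the model.
        result = registry[response["name"]](**response["arguments"])
        messages.append({"role": "tool", "name": response["name"],
                         "content": json.dumps(result)})

def add_numbers(a: float, b: float) -> float:
    """Adds two numbers."""
    return a + b

def fake_llm(messages, tools):
    # Asks for one tool call, then answers once it sees
    # the tool result in the conversation history.
    if not any(m["role"] == "tool" for m in messages):
        return {"type": "tool_use", "name": "add_numbers",
                "arguments": {"a": 2, "b": 3}}
    return {"type": "text", "text": "The sum is 5."}

answer = run_conversation(fake_llm, [], [add_numbers],
                          [{"role": "user", "content": "What is 2 + 3?"}])
print(answer)  # The sum is 5.
```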

---

## Example Snippets

Here's a snippet that shows how you might add a custom function `calculate_dog_age` to your main script:

```python
from tool_converter import ToolConverter
from llm_tools.llm_handler import LLMHandler
from llm_api.gemini_api import GeminiAPI

def calculate_dog_age(human_years: int) -> str:
    """Converts human years to approximate dog years using the common rule."""
    dog_years = human_years * 7
    return f"{human_years} human years is approximately {dog_years} dog years!"

if __name__ == "__main__":
    converter = ToolConverter()
    # Include your custom function
    functions = [calculate_dog_age]

    # Generate schemas
    schemas = converter.generate_schemas(functions)
    print("Schemas:", schemas["gemini"])  # Example: print the Gemini schema

    # Create LLM client
    gemini_client = GeminiAPI(model_name="gemini-2.0-flash-exp")

    # Use the handler
    llm_handler = LLMHandler(gemini_client)
    llm_handler.register_functions(functions)
    llm_handler.set_tools(schemas["gemini"])

    # Set system instructions (optional)
    llm_handler.set_system_prompt("You can convert human years to dog years with the calculate_dog_age tool.")

    # Send user message
    response = llm_handler.send_user_message("I'm 25 years old, how old would I be as a dog?")
    print("LLM Response:", response)
```

---

## Contributing

We welcome issue reports, feature requests, and pull requests. Please open a GitHub Issue first to discuss significant changes or additions.

---

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

Enjoy building with this multi-LLM tool integration system!