https://github.com/intersystems-ib/customer-support-agent-demo
AI-powered customer support agent built with Python smolagents and InterSystems IRIS - SQL, RAG, and live interoperability
- Host: GitHub
- URL: https://github.com/intersystems-ib/customer-support-agent-demo
- Owner: intersystems-ib
- License: mit
- Created: 2025-08-27T08:45:46.000Z (about 1 month ago)
- Default Branch: main
- Last Pushed: 2025-08-30T18:51:43.000Z (about 1 month ago)
- Last Synced: 2025-08-30T20:37:04.163Z (about 1 month ago)
- Topics: ai-agent, customer-support, demo, docker, interoperability, intersystems-iris, llm, openai, python, rag, retrieval-augmented-generation, smolagents, sql, vector-search
- Language: Python
- Homepage:
- Size: 5.24 MB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Customer Support Agent Demo
Customer support agent that helps users resolve questions about orders, products, shipping, etc.
In this project you will build an **AI agent** powered by **Python [smolagents](https://huggingface.co/docs/smolagents/index)** and **InterSystems IRIS**.
---
## Why AI Agents?
AI agents are programs that use an **LLM as their "mind"** and then **call external tools** (databases, APIs, knowledge bases...) to solve real-world tasks.
You *could* build such an agent **by hand**:
* Call the LLM directly from your code.
* Parse its responses.
* Figure out yourself when and how to call different tools (SQL, RAG, APIs...).
* Glue everything together.

That works... but it's complex and fragile.
Instead, you can use a **framework**. In this project we use **[smolagents](https://huggingface.co/docs/smolagents/index)**, a lightweight Hugging Face library. It lets you define **tools** and data sources, while the agent handles the reasoning.
* **CodeAgent** (used here) is a smolagents agent that plans by generating Python code step by step.
* It uses the LLM to decide which tools to call and how to combine them until it has an answer.

Think of it as a **tiny brain with a toolbox**.
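To make this concrete, here is a minimal, hedged sketch of the pattern (illustrative only, not this repo's actual code; the toy tool, the `gpt-4o-mini` model name, and the wiring are assumptions):

```python
# Minimal smolagents sketch: one toy tool plus a CodeAgent that decides when
# to call it. Illustrative only; the real setup lives in
# agent/customer_support_agent.py. Assumes `pip install smolagents openai`
# and an OPENAI_API_KEY environment variable.
import os
from smolagents import CodeAgent, OpenAIServerModel, tool

@tool
def get_order_status(order_id: int) -> str:
    """Return the current status of an order.

    Args:
        order_id: Numeric ID of the order to look up.
    """
    # A stub; the real tools query InterSystems IRIS instead.
    return f"Order {order_id} is: Shipped"

model = OpenAIServerModel(model_id="gpt-4o-mini", api_key=os.environ["OPENAI_API_KEY"])
agent = CodeAgent(tools=[get_order_status], model=model)

print(agent.run("Where is my order 1001?"))
```

The agent sees the tool's docstring, writes small Python snippets that call it when needed, and iterates until it can return a final answer.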
---
## Project architecture
This project relies on two key components:
* **Python** - Runs the AI agent (LLM reasoning + smolagents framework).
* **InterSystems IRIS** - Acts as the data and integration platform:
  * Stores **structured data** (customers, orders, products, shipments).
  * Provides a **vector database** for **unstructured data** (docs, FAQs) with RAG queries.
  * Includes **interoperability** features to simulate live shipping status calls.

Together, they showcase how IRIS enables **real enterprise AI agents** that combine reasoning, structured & unstructured data, and live integrations.
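On the Python side, talking to IRIS is plain SQL over a DB-API connection. A hedged sketch (the port, credentials, and table are assumptions based on the demo defaults; the project's own helpers live in `db/`):

```python
# Hedged sketch of a Python -> IRIS SQL round trip using the
# intersystems-irispython DB-API driver. Port 1972 (the default superserver
# port), the superuser/SYS credentials, and the table name are assumptions;
# the repo's db/ helpers may differ.
import iris

conn = iris.connect("localhost", 1972, "USER", "superuser", "SYS")
try:
    cur = conn.cursor()
    cur.execute("SELECT Name FROM Agent_Data.Products")
    for (name,) in cur.fetchall():
        print(name)
finally:
    conn.close()
```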
---
## What you will build
You will build an AI agent that:
* Uses an LLM as its "mind" to plan and decide how to answer questions that users will ask about their orders, products, etc.
* Uses different tools to access structured data (SQL tables), unstructured data (documents using a RAG pattern), and live external information (interoperability).

---
## Requirements

To run this demo you'll need:

* **[Python 3.9+](https://www.python.org/)**
  Runs the AI agent with smolagents and connects to IRIS.
* **[Docker](https://docs.docker.com/get-docker/)**
  Spins up an **InterSystems IRIS** container (database + vector search + interoperability).
* **[VS Code](https://code.visualstudio.com/)** (recommended)
  For editing and exploring the Python code.
* **[OpenAI API Key](https://platform.openai.com/)**
  Used for the LLM "mind" and also to embed documents for RAG.

---
## Setup
### 1. Setup Python Environment
Create a virtual environment
```bash
# Mac/Linux
python3 -m venv .venv
# Windows
python -m venv .venv
```

Activate the environment:

```bash
# Mac/Linux
source .venv/bin/activate

# Windows (PowerShell)
.venv\Scripts\Activate.ps1
```

Then, install dependencies:

```bash
pip install -r requirements.txt
```

Create a `.env` file:

* Copy `.env.example` to `.env`
* Modify it to include your OpenAI API key.
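If you want to sanity-check that the key is picked up, here is a small sketch (it assumes a `python-dotenv` style `.env` and a variable named `OPENAI_API_KEY`; check `.env.example` for the names this project actually uses):

```python
# Quick sanity check that the .env file is readable. The OPENAI_API_KEY
# variable name is an assumption; use whatever .env.example defines.
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory
if not os.getenv("OPENAI_API_KEY"):
    raise SystemExit("OPENAI_API_KEY is missing; edit your .env file")
print("OpenAI key loaded.")
```

---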
### 2. Setup InterSystems IRIS container
```bash
docker compose build
docker compose up -d
```

You can access the IRIS [Management Portal](http://localhost:52773/csp/sys/UtilHome.csp) using:

* User: `superuser`
* Password: `SYS`

---
## Understanding the repo
Have a look at this summary of the repo contents to understand the project:
```graphql
customer-support-agent/
├─ .env.example                   # sample env file (copy to .env)
│
├─ iris/                          # InterSystems IRIS assets
│  ├─ Dockerfile                  # IRIS container build
│  ├─ sql/                        # SQL scripts
│  │  ├─ schema.sql               # creates tables for Customer Support use case
│  │  ├─ load_data.sql            # loads CSVs into tables
│  │  └─ truncate.sql
│  ├─ data/                       # data: customer, orders, etc.
│  ├─ docs/                       # unstructured Knowledge Base (RAG content)
│  └─ src/                        # IRIS classes. Simulates live shipping status interoperability
│
├─ agent/                         # the AI agent (Python) + tools
│  ├─ customer_support_agent.py   # wraps smolagents CodeAgent + tools
│  └─ tools/
│     ├─ sql_tool.py              # SQL tools
│     ├─ rag_tool.py              # RAG tools
│     └─ shipping_tool.py         # calls IRIS interoperability
│
├─ db/                            # database adapters & helpers
├─ cli/                           # terminal frontend
├─ ui/                            # lightweight web UI (Gradio)
└─ scripts/                       # utility scripts
```

---
## Load SQL Data
Before running the agent, we must create the tables and insert some data.
This will be the structured data that the agent will query to answer user questions.

Run these SQL statements in the IRIS [SQL Explorer](http://localhost:52773/csp/sys/exp/%25CSP.UI.Portal.SQL.Home.zen?$NAMESPACE=USER) or using your favorite SQL client.

```sql
LOAD SQL FROM FILE '/app/iris/sql/schema.sql' DIALECT 'IRIS' DELIMITER ';'
```

```sql
LOAD SQL FROM FILE '/app/iris/sql/load_data.sql' DIALECT 'IRIS' DELIMITER ';'
```

Check the data you have just inserted and familiarize yourself with the tables.
Here are some simple queries you can try:

```sql
-- List all customers
SELECT * FROM Agent_Data.Customers;

-- Get all orders for a given customer
SELECT o.OrderID, o.OrderDate, o.Status, p.Name AS Product
FROM Agent_Data.Orders o
JOIN Agent_Data.Products p ON o.ProductID = p.ProductID
WHERE o.CustomerID = 1;

-- Check shipment info for an order
SELECT * FROM Agent_Data.Shipments WHERE OrderID = 1001;

-- See products with their warranty
SELECT Name, WarrantyMonths FROM Agent_Data.Products;

-- Find orders that are still Processing
SELECT * FROM Agent_Data.Orders WHERE Status = 'Processing';
```

---
## Load and embed unstructured data

The agent will also be able to query unstructured data using a RAG (Retrieval-Augmented Generation) pattern.
For that, we will leverage the InterSystems IRIS Vector Search features.

We will embed the data using the OpenAI `text-embedding-3-small` model.
We will use an InterSystems IRIS feature that allows us to set up embeddings directly in the database:

```sql
INSERT INTO %Embedding.Config (Name, Configuration, EmbeddingClass, VectorLength, Description)
VALUES ('my-openai-config',
'{"apiKey":"your-openai-api-key-here",
"sslConfig": "llm_ssl",
"modelName": "text-embedding-3-small"}',
'%Embedding.OpenAI',
1536,
'a small embedding model provided by OpenAI')
```

Now, run the script that loops over the documents and records that need to be embedded:

```bash
python scripts/embed_sql.py
```

After that, have a look at the tables and check that the embeddings are now included.

```sql
-- Products
SELECT * FROM Agent_Data.Products;
```

```sql
-- Chunks from documents
SELECT * FROM Agent_Data.DocChunks;
```
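If you want to experiment with the vector search yourself, here is a hedged sketch of the kind of similarity query the RAG tool can run. The column names (`ChunkText`, `Embedding`), the port, and the credentials are assumptions; check `iris/sql/schema.sql` and `agent/tools/rag_tool.py` for the real ones:

```python
# Hedged sketch: embed a question with OpenAI, then rank document chunks in
# IRIS by cosine similarity. Column names, port, and credentials are assumed;
# requires the openai and intersystems-irispython packages.
import os
import iris
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

def search_doc_chunks(question: str, top_k: int = 3) -> list[str]:
    # Embed the question with the same model used for the documents.
    emb = client.embeddings.create(model="text-embedding-3-small", input=question)
    query_vector = ",".join(str(x) for x in emb.data[0].embedding)

    conn = iris.connect("localhost", 1972, "USER", "superuser", "SYS")
    try:
        cur = conn.cursor()
        cur.execute(
            f"""
            SELECT TOP {int(top_k)} ChunkText
            FROM Agent_Data.DocChunks
            ORDER BY VECTOR_COSINE(Embedding, TO_VECTOR(?, DOUBLE, 1536)) DESC
            """,
            (query_vector,),
        )
        return [row[0] for row in cur.fetchall()]
    finally:
        conn.close()

print(search_doc_chunks("What is the return policy for headphones?"))
```

---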
## Interoperability
One of the **most powerful features of InterSystems IRIS** is its **Interoperability framework**.
It allows you to seamlessly connect your solution to **other systems, APIs, and services** in a reliable, monitored, and traceable way.

In this demo, we included a **mock shipping service** that shows how an agent can call a live API to retrieve status and timeline information:

```bash
curl --header "Content-Type: application/json" \
--request POST \
--data '{"orderStatus":"Processing","trackingNumber":"DHL7788"}' \
http://localhost:52773/api/shipping/status
```

This information can also be consumed by the agent.
And since IRIS has a built-in **Visual Trace** viewer, you can actually see each message flowing through the system.
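The same call from Python, roughly how a tool can consume the mock service (a small sketch; the `requests` package is assumed to be available):

```python
# Python equivalent of the curl call above to the mock shipping service.
import requests

resp = requests.post(
    "http://localhost:52773/api/shipping/status",
    json={"orderStatus": "Processing", "trackingNumber": "DHL7788"},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())
```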
---
## Understanding the agent
The agent is a smolagents **CodeAgent**:
* It uses an OpenAI mini model (LLM) as its mind to plan and decide which tools to use.
* It runs several steps and uses different tools to try to resolve a user question.

Want to see how it works under the hood? Check out these key files:
* [agent/customer\_support\_agent.py](agent/customer_support_agent.py) - main agent setup
* [agent/tools/sql\_tool.py](agent/tools/sql_tool.py) - SQL queries (orders, products, ranges)
* [agent/tools/rag\_tool.py](agent/tools/rag_tool.py) - RAG queries with IRIS Vector Search
* [agent/tools/shipping\_tool.py](agent/tools/shipping_tool.py) - calls the IRIS interoperability API

Feel free to open them and explore the code: they're short, simple, and quite fun to read.
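To give you an idea before you open them, here is a hedged sketch of what an IRIS-backed smolagents tool can look like (connection details and query follow the demo defaults; the real `agent/tools/sql_tool.py` is likely structured differently):

```python
# Illustrative sketch of a smolagents tool that queries IRIS directly.
# The connection details are the demo defaults; the real tool may use the
# shared db/ helpers instead.
import iris
from smolagents import tool

@tool
def get_orders_for_customer(customer_id: int) -> str:
    """Return a customer's orders as a readable summary.

    Args:
        customer_id: ID of the customer in Agent_Data.Customers.
    """
    conn = iris.connect("localhost", 1972, "USER", "superuser", "SYS")
    try:
        cur = conn.cursor()
        cur.execute(
            "SELECT OrderID, OrderDate, Status FROM Agent_Data.Orders WHERE CustomerID = ?",
            (customer_id,),
        )
        rows = cur.fetchall()
    finally:
        conn.close()
    return "\n".join(f"Order {oid} ({date}): {status}" for oid, date, status in rows)
```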
---
## Running the agent
You can run the agent in three different ways:
### One-shot mode
Run a single question + get a single answer. Perfect for quick tests.
```bash
python -m cli.run --email alice@example.com --message "Where is my order #1001?"
python -m cli.run --email alice@example.com --message "Show electronics that are good for travel"
python -m cli.run --email alice@example.com --message "Was my headphones order delivered, and what's the return window?"
python -m cli.run --email alice@example.com --message "Find headphones under $120 with ANC"
python -m cli.run --email alice@example.com --message "If my order is out of warranty, what options do I have?"
```
---
### Interactive CLI
Start a small session where you can type multiple questions in a row.
```bash
python -m cli.run --email alice@example.com
```

---
### Web UI (Gradio)
A lightweight chat UI with Gradio, so you can talk to the agent in your browser.
```bash
python -m ui.gradio
```

Then open the UI at http://localhost:7860.
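For reference, a Gradio chat front end for an agent can be as small as the sketch below (hypothetical: the `build_agent` factory is an invented name; the actual UI lives in the `ui/` package):

```python
# Hypothetical minimal Gradio wrapper around the agent. The build_agent
# import is invented for illustration; see the ui/ package for the real code.
import gradio as gr
from agent.customer_support_agent import build_agent  # hypothetical factory name

agent = build_agent()

def respond(message, history):
    # Forward each chat turn to the smolagents CodeAgent and return its answer.
    return str(agent.run(message))

gr.ChatInterface(fn=respond, title="Customer Support Agent").launch(server_port=7860)
```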
---
## Wrap up
* You've just set up an **AI-powered customer support agent** that combines **LLM reasoning**, **structured SQL data**, **unstructured docs with RAG**, and **live interoperability APIs**.
* This is a **technical repo**, but hey, customer support is never boring when you have an AI sidekick.
* Next steps? Extend the schema, add more tools, or plug it into your own data sources!