https://github.com/phrugsa-limbunlom/picksmart
PickSmart: an AI-powered shopping assistant platform for personalized product recommendations, built with the Groq API, Tavily, LangChain, and LangGraph
- Host: GitHub
- URL: https://github.com/phrugsa-limbunlom/picksmart
- Owner: phrugsa-limbunlom
- License: mit
- Created: 2024-12-14T13:12:28.000Z (10 months ago)
- Default Branch: main
- Last Pushed: 2025-03-20T19:15:54.000Z (7 months ago)
- Last Synced: 2025-03-30T01:37:42.074Z (7 months ago)
- Topics: agents, fast-api, kafka, langchain, langgraph, mongodb, rag, tavily
- Language: Python
- Homepage:
- Size: 1.56 MB
- Stars: 3
- Watchers: 1
- Forks: 0
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# PickSmart
An AI-powered shopping assistant platform for real-time product search, contextual question answering, and personalized product recommendations, powered by intelligent search and analyst agents.

## Overview
PickSmart is an AI-powered product discovery platform that combines large language models with a distributed system architecture to provide real-time product search, contextual question answering, and personalized product recommendations. The system integrates a Retrieval-Augmented Generation (RAG) architecture with a search agent (Hybrid RAG) for product discovery across multiple e-commerce marketplaces.
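
To make the Hybrid RAG idea concrete, here is a minimal sketch of combining the two retrieval paths: semantic lookup over previously ingested product data plus a live Tavily web search. The function name `hybrid_retrieve` and the way results are merged are illustrative assumptions, not the repository's actual API.

```python
# Conceptual sketch only: merge locally indexed product context with live
# web results before handing both to the LLM. `vector_store` is assumed to
# expose a LangChain-style similarity_search() method.
from typing import List

from tavily import TavilyClient


def hybrid_retrieve(query: str, vector_store, tavily_api_key: str, k: int = 3) -> List[str]:
    # 1) Semantic retrieval from previously ingested product data (RAG path).
    local_docs = [doc.page_content for doc in vector_store.similarity_search(query, k=k)]

    # 2) Real-time product discovery on the web via Tavily (search-agent path).
    web = TavilyClient(api_key=tavily_api_key).search(query, max_results=k)
    web_snippets = [result["content"] for result in web["results"]]

    # 3) Both sources become grounding context for the answer/recommendation prompt.
    return local_docs + web_snippets
```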
## Features
| Name | Description |
|-------|-------------|
| **Ask a question about a product** | Submit a query to get detailed information, including specifications, pricing, and availability. |
| **Search multiple marketplaces and rank the results** | The system aggregates product listings from multiple online marketplaces, then compares and ranks them by relevance, price, and customer reviews. |
| **Receive personalized recommendations** | Get AI-driven suggestions tailored to your preferences, helping you make informed decisions. |

## Core Capabilities
| Function | Description |
|-----------|-------------|
| **Natural Language Query Processing** | Advanced query decomposition and semantic analysis |
| **Vector Database Integration** | Efficient data ingestion and storage for embedding-based semantic retrieval |
| **Distributed Real-time Search** | Multi-marketplace product discovery with parallel processing |
| **Intelligent Product Ranking** | AI-powered relevance scoring and personalization |
| **Context-Aware Recommendations** | Detailed answers with product suggestions |
| **Scalable Data Processing** | Event-driven architecture for high-throughput data handling |

## Demo
https://github.com/user-attachments/assets/47f57b9d-7c8b-4d37-8943-06d1eba58961
## Tech Stack
- **Frontend**: React
- **Backend**: FastAPI
- **Streaming**: Kafka
- **RAG System**: LangChain, MongoDB (vector store)
- **Agents**: LangGraph, Tavily (search)
- **API Client**: Groq API

## System Architecture
### Frontend Layer
- **Framework**: React.js with TypeScript
- **State Management**: Redux for a predictable state container
- **API Integration**: Axios as the HTTP client

### Backend Services
- **API Framework**: FastAPI with asynchronous request handling and high-performance routing
- **Message Broker**: Apache Kafka for event streaming and distributed processing
- **Vector Store**: MongoDB for efficient similarity search and embedding storage
- **Search Engine**: Tavily API integration for real-time, accurate web search
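
As a rough illustration of how the FastAPI and Kafka pieces above could fit together, the sketch below exposes a search endpoint that publishes each query as an event for downstream consumers. The topic name, payload shape, and use of `aiokafka` are assumptions made for the example, not the repository's actual implementation.

```python
# Minimal sketch: FastAPI endpoint publishing search queries to Kafka.
# Topic name ("product-queries") and payload fields are hypothetical.
import json
from typing import Optional

from aiokafka import AIOKafkaProducer
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
producer: Optional[AIOKafkaProducer] = None


class SearchQuery(BaseModel):
    user_id: str
    text: str


@app.on_event("startup")
async def start_producer() -> None:
    global producer
    producer = AIOKafkaProducer(bootstrap_servers="localhost:9092")
    await producer.start()


@app.on_event("shutdown")
async def stop_producer() -> None:
    if producer is not None:
        await producer.stop()


@app.post("/search")
async def search(query: SearchQuery) -> dict:
    # Publish the query as an event; a downstream consumer runs the actual search.
    payload = {"user_id": query.user_id, "text": query.text}
    await producer.send_and_wait("product-queries", json.dumps(payload).encode("utf-8"))
    return {"status": "queued"}
```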

### AI Components
- **LLM Integration**: Groq API for high-performance inference
- **Agent Framework**: LangChain for composable AI components
- **Workflow Orchestration**: LangGraph for agent coordination and planning
- **RAG System**: MongoDB for vector index implementation and semantic retrieval (sketched below)
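
For the AI layer, a minimal sketch of wiring these components together with LangChain might look like the following; the connection string, collection name, index name, and model identifiers are assumptions for illustration, not values taken from this repository.

```python
# Hedged sketch of the AI components: Groq-hosted LLM plus a MongoDB Atlas
# vector store for semantic retrieval. Names marked "assumed" are illustrative.
import os

from langchain_cohere import CohereEmbeddings
from langchain_groq import ChatGroq
from langchain_mongodb import MongoDBAtlasVectorSearch
from pymongo import MongoClient

client = MongoClient(os.environ["MONGO_URI"])      # hypothetical single-URI variable
collection = client["picksmart"]["products"]       # collection name assumed

vector_store = MongoDBAtlasVectorSearch(
    collection=collection,
    embedding=CohereEmbeddings(model="embed-english-v3.0",            # model assumed
                               cohere_api_key=os.environ["CO_API_KEY"]),
    index_name="product_vector_index",             # Atlas Search index name assumed
)

llm = ChatGroq(model="llama-3.1-8b-instant")       # any Groq-hosted model works here

docs = vector_store.similarity_search("wireless noise-cancelling headphones", k=3)
context = "\n".join(doc.page_content for doc in docs)
answer = llm.invoke(f"Using this context:\n{context}\n\nRecommend the best pick under $200.")
print(answer.content)
```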

### Agent Workflow
The system implements a search and analyst agent with a multi-step workflow, using LangGraph for orchestration:
```python
graph = StateGraph(SearchAgentState)
graph.add_node("analyze_query", self.analyze_query_node)
graph.add_node("search_online_shop", self.search_online_node)
graph.add_node("analyze_and_rank", self.analyze_rank_node)
graph.add_node("search_product_source", self.search_source_node)
graph.set_entry_point("analyze_query")
graph.add_edge("analyze_query", "search_online_shop")
graph.add_edge("search_online_shop", "analyze_and_rank")
graph.add_edge("analyze_and_rank", "search_product_source")
graph.set_finish_point("search_product_source")
self.graph = graph.compile(checkpointer=checkpointer)
```

The search and analyst agent progresses through the following sequential states:

| State | Name | Description |
|:-----:|-------|-------------|
| 1 | **Query Analysis State** | Analyzes and decomposes the user query, identifies search intent, and extracts key product attributes; serves as the entry point for all search requests |
| 2 | **Online Shop Search State** | Searches across multiple online marketplaces, retrieves initial product information from available sources, and gathers raw product data for further analysis |
| 3 | **Analysis and Ranking State** | Evaluates and ranks products against multiple criteria, prioritizing results by relevance and quality to surface the most appropriate options |
| 4 | **Product Source Search State** | Extends the search to discover additional product sources across e-commerce platforms, provides direct purchasing links, and validates product availability |

The workflow follows a linear progression through these states, with each state building on the results of the previous one.
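
To show the pattern end to end without the repository's internals, here is a small, self-contained toy version of the same LangGraph structure: a typed state, nodes that return partial state updates, and an invocation keyed by a `thread_id` because the graph is compiled with a checkpointer. The state fields and node logic are placeholders, not the real `SearchAgentState`.

```python
# Self-contained toy version of the checkpointed LangGraph pattern above.
# State fields and node logic are placeholders, not PickSmart's real schema.
from typing import List, TypedDict

from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import StateGraph


class SearchAgentState(TypedDict, total=False):
    query: str
    search_terms: List[str]
    ranked_results: List[str]


def analyze_query(state: SearchAgentState) -> dict:
    # Decompose the user query into search terms (stand-in for the real analysis).
    return {"search_terms": state["query"].split()}


def analyze_and_rank(state: SearchAgentState) -> dict:
    # Rank the terms by length as a stand-in for relevance/price/review scoring.
    return {"ranked_results": sorted(state["search_terms"], key=len, reverse=True)}


graph = StateGraph(SearchAgentState)
graph.add_node("analyze_query", analyze_query)
graph.add_node("analyze_and_rank", analyze_and_rank)
graph.set_entry_point("analyze_query")
graph.add_edge("analyze_query", "analyze_and_rank")
graph.set_finish_point("analyze_and_rank")
app = graph.compile(checkpointer=MemorySaver())

# A checkpointer-backed graph persists state per conversation, so each call
# supplies a thread_id in the config.
result = app.invoke(
    {"query": "budget mechanical keyboard"},
    config={"configurable": {"thread_id": "demo-session"}},
)
print(result["ranked_results"])
```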
## Folder Structure
This project contains a chatbot application with a backend server, Kafka integration, and a frontend interface.
```bash
chatbot-app
├── public               # Static assets for the frontend
├── src                  # Source code for the frontend
├── .gitignore           # Git ignore file for frontend
├── Dockerfile           # Dockerfile for frontend containerization
├── package-lock.json    # Dependency lock file
└── package.json         # Node.js project configuration
chatbot-server
├── src                  # Source code for the backend
│   ├── agent            # AI agent-related logic
│   ├── constants        # Configuration and constant values
│   ├── data             # Data handling and storage logic
│   ├── processor        # Data processing and transformation logic
│   ├── service          # Business logic and service layer
│   ├── utils            # Utility functions and helpers
│   ├── chatbot.py       # Main chatbot logic
│   └── main.py          # Entry point for the backend server
├── .gitignore           # Git ignore file for backend
├── Dockerfile           # Dockerfile for backend containerization
├── __init__.py          # Marks the package directory
├── model.yaml           # AI model configuration file
└── requirements.txt     # Python dependencies
kafka
├── config               # Kafka configuration files
├── Dockerfile           # Dockerfile for Kafka setup
└── .gitignore           # Git ignore file for Kafka
LICENSE                  # Project license file
```
## Configuration

### Environment Variables
Create a ```.env``` file under the ```chatbot-server``` folder and add API keys and MongoDB configuration as shown below (a short sketch of loading these values follows the setup links):

```env
GROQ_API_KEY=""
TAVILY_API_KEY=""
CO_API_KEY=""
MONGO_USER_NAME=""
MONGO_PASSWORD=""
MONGO_CLUSTER="picksmart-cluster"
MONGO_DATABASE="picksmart"
```
- To get a Groq API key: https://console.groq.com/keys
- To get a Tavily API key: https://tavily.com/
- To get a Cohere API key: https://dashboard.cohere.com/
- To set up MongoDB: https://www.mongodb.com/
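
On the backend, these variables would typically be read with `python-dotenv` and assembled into a MongoDB connection string. The SRV URI below is the standard Atlas form, but the exact host for a cluster comes from the Atlas UI, so treat the string construction as an assumption rather than the project's actual code.

```python
# Hedged sketch: load the .env values and connect to MongoDB Atlas.
# The host format is assumed; copy the real connection string from Atlas.
import os

from dotenv import load_dotenv
from pymongo import MongoClient

load_dotenv()  # reads the .env file in the current working directory (chatbot-server)

uri = (
    f"mongodb+srv://{os.environ['MONGO_USER_NAME']}:{os.environ['MONGO_PASSWORD']}"
    f"@{os.environ['MONGO_CLUSTER']}.mongodb.net/{os.environ['MONGO_DATABASE']}"
)
client = MongoClient(uri)
print(client[os.environ["MONGO_DATABASE"]].list_collection_names())
```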

### Model Configuration

Configure the LLM model in the ```model.yaml``` file (a short reading sketch follows the snippet):
```yaml
LLM:
```
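
Since the file's schema isn't spelled out here, the snippet below assumes a simple `LLM: {name: ...}` layout and shows how such a value could be read and passed to the Groq client via LangChain; both the key name and the model id are assumptions.

```python
# Hypothetical example: read model.yaml and hand the model name to ChatGroq.
# Assumes a layout like  LLM: {name: llama-3.1-8b-instant}  in model.yaml.
import yaml
from langchain_groq import ChatGroq

with open("model.yaml") as f:
    config = yaml.safe_load(f)

llm = ChatGroq(model=config["LLM"]["name"])  # GROQ_API_KEY is read from the environment
print(llm.invoke("Hello!").content)
```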

## Deployment

### Docker Deployment
The application supports containerized deployment using Docker and Docker Compose for simplified orchestration.

1. Clone the repository:
```bash
git clone https://github.com/phrugsa-limbunlom/PickSmart.git
cd PickSmart
```

2. Create the network:
```bash
docker network create chatbot-network
```

3. Deploy services:
```bash
docker-compose up --build
```

The application will be accessible at `localhost:3000`.
### Manual Installation
1. Clone the repository:
```bash
git clone https://github.com/phrugsa-limbunlom/PickSmart.git
cd PickSmart
```

2. Install backend dependencies:
```bash
pip install -r chatbot-server/requirements.txt
```

3. Install frontend dependencies:
```bash
cd chatbot-app
npm install
```

4. Initialize Kafka services
5. Launch backend server:
```bash
uvicorn main:app --reload
```

6. Start the frontend application:
```bash
npm start
```

## System Requirements
- Python 3.8+
- Node.js 14+
- Docker 20.10+
- Docker Compose 2.0+
- 8GB RAM minimum
- 4 CPU cores recommended

## License
PickSmart is released under the MIT License. See the [LICENSE](https://github.com/phrugsa-limbunlom/PickSmart/blob/main/LICENSE) file for more details.