https://github.com/muzammil-git/converse-ai
Converse is a demo application showcasing conversational AI using DeepSeek R1, Hugging Face embeddings, and LLaMA Index. It features natural dialogue capabilities, Chroma DB vector storage, and a user-friendly Gradio interface for seamless human-AI interaction.
- Host: GitHub
- URL: https://github.com/muzammil-git/converse-ai
- Owner: muzammil-git
- Created: 2025-03-05T00:17:32.000Z (8 months ago)
- Default Branch: main
- Last Pushed: 2025-03-05T00:38:44.000Z (8 months ago)
- Last Synced: 2025-03-05T01:25:17.393Z (8 months ago)
- Topics: deepseek-r1, docker, fastapi, gradio, grok, huggingface-embeddings, llamaindex, postgresql, vectordb
- Language: Python
- Homepage:
- Size: 10.7 MB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: readme.md
README
# Converse: A Conversational AI Demo
## Introduction
Converse is a conversational AI demo that showcases the capabilities of the DeepSeek R1 model, Hugging Face embeddings, and the LLaMA Index framework.

## Client-Side
## Server-Side
## Features
* **Conversational Interface**: Engage in natural-sounding conversations with the AI model.
* **Response Generation**: The AI model generates responses based on the user's input.
* **DeepSeek R1 Model**: A reasoning-focused LLM from DeepSeek, served here through the Groq API.
* **Hugging Face Embeddings**: Pre-trained embedding models from Hugging Face for turning text into vectors.
* **LLaMA Index Framework**: A data framework for connecting LLMs to external data and orchestrating retrieval.
* **Chroma DB**: A vector store for efficient storage and retrieval of embeddings (see the wiring sketch after this list).
* **Client-Side Interface**: Built using Gradio, a Python library for building web user interfaces.
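As a rough illustration of how these pieces could fit together with LlamaIndex, the sketch below embeds a folder of documents with a Hugging Face model, stores the vectors in a persistent Chroma collection, and configures a Groq-hosted DeepSeek R1 model as the LLM. The model names, paths, and collection name are illustrative assumptions, not values taken from this repository.

```python
# Illustrative wiring of the stack described above (model names, paths, and
# the collection name are assumptions, not taken from this repository).
import chromadb
from llama_index.core import Settings, SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.groq import Groq
from llama_index.vector_stores.chroma import ChromaVectorStore

# Hugging Face embeddings for vectorizing documents and queries.
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")
# DeepSeek R1 (distilled) served through Groq; the model id is an assumption.
Settings.llm = Groq(model="deepseek-r1-distill-llama-70b")

# Chroma DB as the persistent vector store.
chroma_client = chromadb.PersistentClient(path="./chroma_db")
collection = chroma_client.get_or_create_collection("converse")
vector_store = ChromaVectorStore(chroma_collection=collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Build the index from a local folder of documents.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)
```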
## Setup and Usage

1. Clone the repository: `git clone https://github.com/muzammil-git/converse-ai.git`
2. Install the required dependencies: `poetry install`
3. Run the server: `poetry run python main.py`
4. The Gradio client comes up at `http://127.0.0.1:7860`; the FastAPI server listens on `http://localhost:8000`.
5. Open a web browser, navigate to the Gradio URL, and start chatting with the AI model (a rough sketch of this client/server split follows these steps).
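The repository's client code is not reproduced here, but conceptually the split works like this: a Gradio chat UI on port 7860 forwards each message to the FastAPI backend on port 8000 and displays the reply. The `/chat` endpoint and payload shape below are hypothetical placeholders, not the repo's actual API.

```python
# Hypothetical sketch of the client/server split: a Gradio chat UI on port
# 7860 forwarding messages to a FastAPI backend on port 8000. The /chat
# endpoint and JSON payload are assumptions, not the repo's actual API.
import gradio as gr
import requests

API_URL = "http://localhost:8000/chat"  # assumed FastAPI endpoint

def respond(message, history):
    # Send the user's message to the backend and return the model's reply.
    resp = requests.post(API_URL, json={"message": message}, timeout=120)
    resp.raise_for_status()
    return resp.json().get("response", "")

demo = gr.ChatInterface(respond, title="Converse")
demo.launch(server_port=7860)  # Gradio's default port
```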
## Example Use Cases

* **Simple Conversation**: Ask the AI model a question, and it will respond accordingly.
* **Multi-Turn Conversation**: Engage in a conversation with the AI model, and it will respond based on the context.
* **Retrieval-Augmented Generation (RAG)**: Use the AI model to generate responses based on a set of documents or knowledge base.
* **Vector Store**: Use Chroma DB to store and retrieve embeddings for efficient querying (a query sketch follows this list).
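A hedged sketch of the RAG and vector-store use cases above, reopening the Chroma collection from the earlier indexing sketch; the model id, paths, and prompts are assumptions for illustration only.

```python
# Illustrative retrieval-augmented queries against the Chroma-backed index
# built in the indexing sketch above (names and paths are assumptions).
import chromadb
from llama_index.core import Settings, VectorStoreIndex
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.groq import Groq
from llama_index.vector_stores.chroma import ChromaVectorStore

Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")
Settings.llm = Groq(model="deepseek-r1-distill-llama-70b")  # assumed Groq model id

# Reopen the persisted Chroma collection and rebuild the index handle.
collection = chromadb.PersistentClient(path="./chroma_db").get_or_create_collection("converse")
index = VectorStoreIndex.from_vector_store(ChromaVectorStore(chroma_collection=collection))

# Multi-turn conversation: each turn retrieves relevant chunks from the
# vector store and keeps chat history so follow-ups stay in context.
chat_engine = index.as_chat_engine(chat_mode="context")
print(chat_engine.chat("What do the indexed documents say about setup?"))
print(chat_engine.chat("And how do I run the client?"))

# One-off retrieval-augmented query without chat history.
query_engine = index.as_query_engine(similarity_top_k=3)
print(query_engine.query("Summarize the main features of the documents."))
```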
## Future Enhancements

* **Improved Response Generation**: Enhance the AI model's ability to generate more accurate and informative responses.
* **Client Side**: Polish the human-interaction layer, including Markdown rendering of model responses.
* **Emotional Intelligence**: Integrate emotional intelligence into the AI model to make it more empathetic and understanding.
* **Multimodal Interaction**: Allow users to interact with the AI model through additional modalities, such as voice alongside text.