Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/trendy-design/llmchat
Most intuitive unified AI chat interface.
ai claude langchain llm local-rag nextjs ollama ollama-client pglite rag shadcn-ui tailwindcss tiptap typescript vector vercel
Last synced: 6 days ago
- Host: GitHub
- URL: https://github.com/trendy-design/llmchat
- Owner: trendy-design
- License: MIT
- Created: 2024-05-11T03:34:28.000Z (8 months ago)
- Default Branch: main
- Last Pushed: 2024-10-16T12:44:05.000Z (2 months ago)
- Last Synced: 2024-12-15T10:13:01.061Z (7 days ago)
- Topics: ai, claude, langchain, llm, local-rag, nextjs, ollama, ollama-client, pglite, rag, shadcn-ui, tailwindcss, tiptap, typescript, vector, vercel
- Language: TypeScript
- Homepage: https://llmchat.co
- Size: 7.87 MB
- Stars: 328
- Watchers: 7
- Forks: 62
- Open Issues: 8
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
Most intuitive all-in-one AI chat interface.
## Key Features
- 🧠 **Multiple LLM Providers**: Supports various language models, including Ollama (see the sketch after this list).
- 🔌 **Plugins Library**: Enhance functionality with an expandable plugin system, including function calling capabilities.
- 🌐 **Web Search Plugin**: Allows AI to fetch and utilize real-time web data.
- 🤖 **Custom Assistants**: Create and tailor AI assistants for specific tasks or domains.
- 🗣️ **Text-to-Speech**: Converts AI-generated text responses to speech using Whisper.
- 🎙️ **Speech-to-Text**: (Coming soon) Enables voice input for more natural interaction.
- 💾 **Local Storage**: Securely store data locally using in-browser IndexedDB for faster access and privacy.
- 📤📥 **Data Portability**: Easily import or export chat data for backup and migration.
- 📚 **Knowledge Spaces**: (Coming soon) Build custom knowledge bases for specialized topics.
- 📝 **Prompt Library**: Use pre-defined prompts to guide AI conversations efficiently.
- 👤 **Personalization**: Memory plugin ensures more contextual and personalized responses.
- 📱 **Progressive Web App (PWA)**: Installable on various devices for a native-like app experience.
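As a rough illustration of the Ollama provider support listed above, here is a minimal TypeScript sketch of a single chat request against a locally running Ollama server. It is not the project's actual provider code; the model name and the non-streaming call are assumptions made for brevity.

```typescript
// Minimal sketch: send one chat turn to a local Ollama server (default port 11434).
// Assumes Ollama is running and a model such as "llama3" has been pulled; adjust as needed.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

async function chatWithOllama(messages: ChatMessage[]): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3", messages, stream: false }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.message.content; // non-streaming responses carry the full reply here
}

// Example usage:
// chatWithOllama([{ role: "user", content: "Hello!" }]).then(console.log);
```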
## Tech Stack
- 🌍 **Next.js**
- 🔤 **TypeScript**
- 🗂️ **Pglite**
- 🧩 **LangChain**
- 📦 **Zustand** (see the sketch after this list)
- 🔄 **React Query**
- 🗄️ **Supabase**
- 🎨 **Tailwind CSS**
- ✨ **Framer Motion**
- 🖌️ **Shadcn**
- 📝 **Tiptap**
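Purely as an illustration of how Zustand and TypeScript fit together for client-side chat state in an app like this, here is a hedged sketch of a small chat store. The store shape and field names are assumptions, not the project's actual state.

```typescript
// Illustrative only: a tiny Zustand store for chat messages (shape is assumed).
import { create } from "zustand";

type Message = { id: string; role: "user" | "assistant"; content: string };

type ChatState = {
  messages: Message[];
  addMessage: (message: Message) => void;
  clear: () => void;
};

export const useChatStore = create<ChatState>((set) => ({
  messages: [],
  addMessage: (message) =>
    set((state) => ({ messages: [...state.messages, message] })),
  clear: () => set({ messages: [] }),
}));
```

In a component, `useChatStore((s) => s.messages)` would subscribe to just the message list, which keeps re-renders scoped to the chat view.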
## Roadmap
- 🎙️ **Speech-to-Text**: Coming soon.
- 📚 **Knowledge Spaces**: Coming soon.
## Quick Start
To get the project running locally:
### Prerequisites
- Ensure you have `yarn` or `bun` installed.
### Installation
1. Clone the repository:
```bash
git clone https://github.com/trendy-design/llmchat.git
cd llmchat
```
2. Install dependencies:
```bash
yarn install
# or
bun install
```
3. Start the development server:
```bash
yarn dev
# or
bun dev
```
4. Open your browser and navigate to `http://localhost:3000`.
![og_6x](https://github.com/user-attachments/assets/4813a6b5-3294-4056-88bb-c536a45c220c)
## Deployment
Instructions for deploying the project will be added soon.