https://github.com/nomadbin/nuxtchat
NuxtChat - Chat with multiple AI models simultaneously. Supported AI models: DeepSeek / OpenAI / Qwen / LingYi WanWu / Kimi / Minimax / Doubao / Zhipu / Hunyuan / Stepfun / Ollama
- Host: GitHub
- URL: https://github.com/nomadbin/nuxtchat
- Owner: NomadBin
- License: MIT
- Created: 2025-01-22T03:02:02.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2025-01-23T12:01:26.000Z (about 1 year ago)
- Last Synced: 2025-03-15T18:52:18.169Z (11 months ago)
- Topics: chatgpt, cloudflare, deepseek, nuxt, openai, vue, webui
- Language: Vue
- Homepage: https://nuxtchat.com
- Size: 886 KB
- Stars: 3
- Watchers: 1
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.en.md
README
NuxtChat
Engage in conversations with multiple large language models simultaneously. Supports DeepSeek, OpenAI models, and local Ollama models.
---
**English** | [简体中文](./README.md)
## Supported Models
[DeepSeek](https://platform.deepseek.com) / [OpenAI](https://platform.openai.com) / [Qwen](https://help.aliyun.com/zh/model-studio/getting-started/models) / [LingYi WanWu](https://platform.lingyiwanwu.com) / [Kimi](https://platform.moonshot.cn/) / [Minimax](https://platform.minimaxi.com) / [Doubao](https://console.volcengine.com/ark/region:ark+cn-beijing/openManagement?LLM=%7B%7D&OpenTokenDrawer=false) / [Zhipu](https://open.bigmodel.cn) / [Hunyuan](https://cloud.tencent.com/document/product/1729/104753) / [Stepfun](https://platform.stepfun.com) / [Ollama](https://ollama.com/download)
## Quick Start
1. Visit [nuxtchat.com](https://nuxtchat.com).
2. Prepare an API key for the model you wish to use, select that model to start a conversation, and enter the key when prompted to begin chatting.
* API keys and chat history are stored only in your browser's local storage, and model API requests are sent directly from your browser to each provider; nothing passes through a NuxtChat server.
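To illustrate the browser-only flow described above, here is a minimal sketch of how a client like NuxtChat might build a request to an OpenAI-compatible chat endpoint, with the key read from local storage. All names below (`buildChatRequest`, the `nuxtchat_api_key` storage key, the DeepSeek base URL) are illustrative assumptions, not taken from the repository's actual code.

```typescript
// A chat message in the OpenAI-compatible request format.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Minimal request shape, kept local so the sketch runs outside a browser too.
interface RequestOptions {
  method: string;
  headers: Record<string, string>;
  body: string;
}

// Build the URL and fetch options for a POST to {baseUrl}/chat/completions.
// The API key travels only in this request, directly to the provider.
function buildChatRequest(
  baseUrl: string,
  apiKey: string,
  model: string,
  messages: ChatMessage[]
): { url: string; init: RequestOptions } {
  return {
    url: `${baseUrl}/chat/completions`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// In the browser, the key would come from local storage, e.g.:
//   const apiKey = localStorage.getItem("nuxtchat_api_key") ?? "";
// and the request would be sent with fetch(req.url, req.init).
const req = buildChatRequest(
  "https://api.deepseek.com/v1", // hypothetical base URL
  "sk-example",
  "deepseek-chat",
  [{ role: "user", content: "Hello" }]
);
```

Because the request is assembled and sent entirely client-side, the provider sees the user's own key and IP; there is no intermediary that could log the key or the conversation.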

