Ecosyste.ms: Awesome

An open API service indexing awesome lists of open source software.

https://github.com/yyq1025/chatbot

An AI chatbot app built with Next.js, the Vercel AI SDK, Google Gemini, Auth0, and MongoDB
auth0 chatbot chatgpt gemini gemini-flash mongodb nextjs vercel-ai vercel-ai-sdk

Last synced: 19 days ago

README


# Chatbot

An open-source AI chatbot app built with Next.js, the Vercel AI SDK, Google Gemini, and MongoDB.

## Features

- [Next.js](https://nextjs.org) App Router
- React Server Components (RSCs), Suspense, and Server Actions
- [Vercel AI SDK](https://sdk.vercel.ai/docs) for streaming chat UI (see the client sketch after this list)
- Support for Google Gemini (default), Anthropic, Cohere, Hugging Face, or custom AI chat models, with optional [LangChain](https://js.langchain.com) integration
- [shadcn/ui](https://ui.shadcn.com)
- Styling with [Tailwind CSS](https://tailwindcss.com)
- [Radix UI](https://radix-ui.com) for headless component primitives
- Icons from [Phosphor Icons](https://phosphoricons.com)
- Chat History with [MongoDB](https://www.mongodb.com/)
- [Auth0](https://auth0.com/) for authentication
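
On the client, the streaming chat UI is driven by the AI SDK's `useChat` hook. The sketch below is illustrative rather than this repo's actual component: it assumes the hook's documented defaults (it posts to `/api/chat`), and the import path (`ai/react` vs `@ai-sdk/react`) depends on the SDK version.

```tsx
'use client';

import { useChat } from 'ai/react';

export default function Chat() {
  // useChat posts to /api/chat by default and streams the assistant's
  // reply into `messages` token by token as it arrives.
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          <strong>{m.role}:</strong> {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Send a message..."
        />
      </form>
    </div>
  );
}
```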

## Model Providers

This template ships with Google Gemini `gemini-1.5-flash-latest` as the default. However, thanks to the [Vercel AI SDK](https://sdk.vercel.ai/docs), you can switch LLM providers to [Anthropic](https://anthropic.com), [Cohere](https://cohere.com/), or [Hugging Face](https://huggingface.co), or use [LangChain](https://js.langchain.com), with just a few lines of code.
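
In practice the provider is chosen in a single `streamText` call in the chat route handler. The sketch below assumes a conventional App Router route at `app/api/chat/route.ts` and the official `@ai-sdk/google` / `@ai-sdk/anthropic` provider packages; it is not this repo's exact code, and the response helper name can vary across AI SDK versions.

```ts
// app/api/chat/route.ts (illustrative path, not necessarily this repo's layout)
import { streamText } from 'ai';
import { google } from '@ai-sdk/google';
// import { anthropic } from '@ai-sdk/anthropic';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    // Default model; switching providers is a one-line change, e.g.
    // model: anthropic('claude-3-5-sonnet-20240620'),
    model: google('gemini-1.5-flash-latest'),
    messages,
  });

  // Streams the tokens back to the useChat hook on the client.
  return result.toDataStreamResponse();
}
```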

## Running locally

You will need to set the environment variables [defined in `.env.example`](.env.example) to run the chatbot.

> Note: Do not commit your `.env` file, as it contains secrets that would allow others to access your AI provider, Auth0, and MongoDB accounts.

```bash
# Copy the example env file and fill in your own values first (see .env.example)
cp .env.example .env
pnpm install
pnpm dev
```

Your app template should now be running on [localhost:3000](http://localhost:3000/).