Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/rashadphz/farfalle
🔍 AI search engine - self-host with local or cloud LLMs
- Host: GitHub
- URL: https://github.com/rashadphz/farfalle
- Owner: rashadphz
- License: apache-2.0
- Created: 2024-04-25T21:24:27.000Z (9 months ago)
- Default Branch: main
- Last Pushed: 2024-09-27T07:52:45.000Z (4 months ago)
- Last Synced: 2024-10-29T15:18:05.911Z (2 months ago)
- Topics: fastapi, generative-ui, gpt-4o, groq, llm, nextjs, ollama, openai, perplexity, react, search-engine, shadcn-ui, tailwindcss
- Language: TypeScript
- Homepage: https://www.farfalle.dev/
- Size: 2.18 MB
- Stars: 2,688
- Watchers: 26
- Forks: 242
- Open Issues: 42
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- StarryDivineSky - rashadphz/farfalle (A01_Text Generation_Text Dialogue / Large language dialogue models and data)
- project-awesome - rashadphz/farfalle - 🔍 AI search engine - self-host with local or cloud LLMs (TypeScript)
- AiTreasureBox - rashadphz/farfalle - ![stars](https://img.shields.io/github/stars/rashadphz/farfalle.svg) 🔍 AI search engine - self-host with local or cloud LLMs (Repos)
README
# Farfalle
Open-source AI-powered search engine (a Perplexity clone).
Run local LLMs (**llama3**, **gemma**, **mistral**, **phi3**), custom LLMs through **LiteLLM**, or use cloud models (**Groq/Llama3**, **OpenAI/gpt-4o**).
https://github.com/rashadphz/farfalle/assets/20783686/9527a8c9-a13b-4e53-9cda-a3ab28d671b2
Please feel free to contact me on [Twitter](https://twitter.com/rashadphz) or [create an issue](https://github.com/rashadphz/farfalle/issues/new) if you have any questions.
## 💻 Live Demo
[farfalle.dev](https://farfalle.dev/) (Cloud models only)
## 📖 Overview
- 🛠️ [Tech Stack](#%EF%B8%8F-tech-stack)
- 🏃🏿‍♂️ [Getting Started](#%EF%B8%8F-getting-started-locally)
- 🚀 [Deploy](#-deploy)

## 🛣️ Roadmap
- [x] Add support for local LLMs through Ollama
- [x] Docker deployment setup
- [x] Add support for [searxng](https://github.com/searxng/searxng), which eliminates the need for external search API dependencies
- [x] Create a pre-built Docker Image
- [x] Add support for custom LLMs through LiteLLM
- [x] Chat History
- [x] Expert Search
- [ ] Chat with local files

## 🛠️ Tech Stack
- Frontend: [Next.js](https://nextjs.org/)
- Backend: [FastAPI](https://fastapi.tiangolo.com/)
- Search API: [SearXNG](https://github.com/searxng/searxng), [Tavily](https://tavily.com/), [Serper](https://serper.dev/), [Bing](https://www.microsoft.com/en-us/bing/apis/bing-web-search-api)
- Logging: [Logfire](https://pydantic.dev/logfire)
- Rate Limiting: [Redis](https://redis.io/)
- Components: [shadcn/ui](https://ui.shadcn.com/)
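In Docker terms, the stack above maps to a handful of cooperating services. The outline below is only an illustrative sketch; the service names and settings here are assumptions, and the authoritative definitions live in `docker-compose.dev.yaml`:

```
# Hypothetical sketch of how the pieces fit together,
# not the project's actual compose file
services:
  frontend:    # Next.js + shadcn/ui, serves the UI
    ports: ["3000:3000"]
  backend:     # FastAPI, answers API requests
    ports: ["8000:8000"]
  searxng:     # self-hosted search provider
    image: searxng/searxng
  redis:       # backs rate limiting
    image: redis:alpine
```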
## Features
- Search with multiple search providers (Tavily, SearXNG, Serper, Bing)
- Answer questions with cloud models (OpenAI/gpt-4o, OpenAI/gpt-3.5-turbo, Groq/Llama3)
- Answer questions with local models (llama3, mistral, gemma, phi3)
- Answer questions with any custom LLMs through [LiteLLM](https://litellm.vercel.app/docs/providers)
- Search with an agent that plans and executes searches for better results

## 🏃🏿‍♂️ Getting Started Locally
### Prerequisites
- [Docker](https://docs.docker.com/get-docker/)
- [Ollama](https://ollama.com/download) (If running local models)
- Download any of the supported models: **llama3**, **mistral**, **gemma**, **phi3**
  - Start the Ollama server: `ollama serve` (see the example below)
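For example, pulling and serving **llama3** looks like this (swap in whichever supported model you prefer):

```
# Download the llama3 model from the Ollama registry
ollama pull llama3
# Start the Ollama server (listens on http://localhost:11434 by default)
ollama serve
```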
### Get API Keys
- [Tavily (Optional)](https://app.tavily.com/home)
- [Serper (Optional)](https://serper.dev/dashboard)
- [OpenAI (Optional)](https://platform.openai.com/api-keys)
- [Bing (Optional)](https://www.microsoft.com/en-us/bing/apis/bing-web-search-api)
- [Groq (Optional)](https://console.groq.com/keys)

### Quick Start
```
git clone https://github.com/rashadphz/farfalle.git
cd farfalle && cp .env-template .env
```
Modify `.env` with your API keys (optional; not required if using Ollama).
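As a rough sketch, a filled-in `.env` might look like the following; the exact variable names come from `.env-template`, so treat these keys as illustrative placeholders:

```
# Hypothetical example values; only set keys for providers you use
TAVILY_API_KEY=tvly-xxxxxxxx
OPENAI_API_KEY=sk-xxxxxxxx
GROQ_API_KEY=gsk_xxxxxxxx
```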
Start the app:
```
docker-compose -f docker-compose.dev.yaml up -d
```
Wait for the app to start, then visit [http://localhost:3000](http://localhost:3000).
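If the page isn't up yet, you can watch the containers come up with standard Docker Compose commands (nothing Farfalle-specific):

```
# Show the status of each service
docker-compose -f docker-compose.dev.yaml ps
# Follow logs until the services report they are ready
docker-compose -f docker-compose.dev.yaml logs -f
```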
For custom setup instructions, see [custom-setup-instructions.md](/custom-setup-instructions.md)
## ๐ Deploy
### Backend
[![Deploy to Render](https://render.com/images/deploy-to-render-button.svg)](https://render.com/deploy?repo=https://github.com/rashadphz/farfalle)
After the backend is deployed, copy the web service URL to your clipboard.
It should look something like: `https://some-service-name.onrender.com`.
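A quick way to confirm the backend is reachable before wiring up the frontend is a plain HTTP check (substitute your actual Render URL):

```
# Expect an HTTP response (even a 404 proves the service is up)
curl -I https://some-service-name.onrender.com
```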
### Frontend
Use the copied backend URL in the `NEXT_PUBLIC_API_URL` environment variable when deploying with Vercel.
[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Frashadphz%2Ffarfalle&env=NEXT_PUBLIC_API_URL&envDescription=URL%20for%20your%20backend%20application.%20For%20backends%20deployed%20with%20Render%2C%20the%20URL%20will%20look%20like%20this%3A%20https%3A%2F%2F%5Bsome-hostname%5D.onrender.com&root-directory=src%2Ffrontend)
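If you deploy from the command line instead of the button, the same variable can be set with the Vercel CLI; this is a sketch assuming the project is already linked:

```
# Add the backend URL as a production env variable (prompts for the value)
vercel env add NEXT_PUBLIC_API_URL production
# Redeploy so the new variable takes effect
vercel --prod
```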
And you're done! 🥳
## Use Farfalle as a Search Engine
To use Farfalle as your default search engine, follow these steps:
1. Open your browser's settings
2. Go to 'Search Engines'
3. Create a new search engine entry using this URL: `http://localhost:3000/?q=%s`
4. Add the search engine
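For example, once the entry is set as your default, typing `farfalle pasta` in the address bar resolves to http://localhost:3000/?q=farfalle%20pasta, with the browser substituting your query for `%s`.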