{"id":25315721,"url":"https://github.com/rabbicse/llm","last_synced_at":"2025-04-07T15:46:58.271Z","repository":{"id":275637532,"uuid":"926708699","full_name":"rabbicse/llm","owner":"rabbicse","description":"This repository features a Next.js (React 19) frontend and FastAPI backend, integrating Ollama and DeepSeek-R1 for AI-driven functionality. Designed for efficiency and scalability, it supports real-time updates through event streaming, enabling high-performance AI interactions.","archived":false,"fork":false,"pushed_at":"2025-03-18T10:00:27.000Z","size":82158,"stargazers_count":6,"open_issues_count":1,"forks_count":3,"subscribers_count":1,"default_branch":"master","last_synced_at":"2025-03-18T11:23:13.150Z","etag":null,"topics":["chatbot","deepseek-r1","event-streaming","fastapi","java","llm","natural-language-processing","nextjs15","ollama","python","reactjs","spring-ai","spring-boot-3","typescript","vite-react"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"gpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/rabbicse.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2025-02-03T18:18:13.000Z","updated_at":"2025-03-18T10:00:30.000Z","dependencies_parsed_at":"2025-02-03T19:30:03.850Z","dependency_job_id":"965acf20-5b4b-4214-9b0b-1da7d2af4a84","html_url":"https://github.com/rabbicse/llm","commit_stats":null,"previous_names":["rabbicse/llm"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/rabbicse%2Fllm","tags_url":"https://repos.ecos
yste.ms/api/v1/hosts/GitHub/repositories/rabbicse%2Fllm/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/rabbicse%2Fllm/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/rabbicse%2Fllm/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/rabbicse","download_url":"https://codeload.github.com/rabbicse/llm/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247684316,"owners_count":20979045,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["chatbot","deepseek-r1","event-streaming","fastapi","java","llm","natural-language-processing","nextjs15","ollama","python","reactjs","spring-ai","spring-boot-3","typescript","vite-react"],"created_at":"2025-02-13T18:24:49.416Z","updated_at":"2025-04-07T15:46:58.261Z","avatar_url":"https://github.com/rabbicse.png","language":"Python","readme":"# Large Language Models (LLM)\n\n## 🚀 LLM-Powered App with Next.js, React 19, FastAPI, and Event Streaming  \n\nThis repository contains an **LLM-powered application** built with **Next.js (React 19) for the frontend** and **FastAPI with Python** for the backend. It integrates **Ollama** and **DeepSeek-R1** to provide seamless AI-driven functionality. 
The project is designed for efficient, scalable, and high-performance AI interactions, incorporating event streaming for real-time updates.\n\n---\n\n![ai-chatbot](screenshots/modern-ai-chatbot.webp)\n\n## 🛠️ Tech Stack  \n- **Frontend:** Next.js + React 19 ⚡  \n- **Backend:** FastAPI + Python 🐍  \n- **LLM:** DeepSeek-R1 via Ollama 🤖  \n- **Event Streaming:** Real-time communication  \n\n## ✨ Features  \n- Seamless AI model integration with **FastAPI**  \n- Fast and responsive UI built with **Next.js \u0026 React 19**  \n- Local or cloud-based **LLM inference** using **DeepSeek-R1** and **Ollama**  \n- Scalable and modular **FastAPI backend**  \n- Event streaming for real-time updates  \n\n## 🚀 Getting Started  \n\n### Prerequisites  \nEnsure you have the following installed:  \n- Ollama installed and running  \n- DeepSeek-R1 model available  \n- Python 3.10+  \n- Node.js \u0026 npm  \n\n### Install and serve Ollama\nGo to `https://ollama.com` and click the download button. It will redirect to the download page `https://ollama.com/download`. Download the installer for your operating system and install it.\n\nThen go to the `models` section. All available models for the Ollama platform are listed at `https://ollama.com/search`. Click on `deepseek-r1`, for example. From the dropdown, select your desired tag, e.g., `1.5b`. Then copy the run command, for example, `ollama run deepseek-r1:1.5b`, replace **run** with **pull**, and run it in your terminal.\n\n```\nollama pull deepseek-r1:1.5b\n```\n\nThen run the following command to serve it on the default port.\n```\nollama serve\n```\nIt should run on the default port: `11434`.\n\nTo run the Ollama server as a remote server, you need to set the `OLLAMA_HOST` environment variable. 
For example, to expose it on all network interfaces, set it as follows.\n\nOn Windows (PowerShell),\n```bash\n$env:OLLAMA_HOST=\"0.0.0.0\"\n```\nThen run the following command.\n```bash\nollama serve\n```\n\nNote: I will cover more configuration options, such as serving as a remote server, in later steps. Stay tuned!\n\n### REST API Backend\nI have created a separate README for backend development with FastAPI. You can choose any other language or framework, but the basic mechanisms are similar.  \n[Backend Development](https://github.com/rabbicse/llm/tree/master/projects/chatbot/backend)\n\n### Create Next.js project for frontend\nI have created a separate README for frontend development with `Next.js` and `React 19`. You can choose any other framework, but the basic mechanisms are similar.  \n[Frontend Development](https://github.com/rabbicse/llm/tree/master/projects/chatbot/frontend/ai-chatbot)\n\n## References\n- [Ollama](https://ollama.com)  \n- [FastAPI](https://fastapi.tiangolo.com/)  \n- [React](https://react.dev)  \n- [Next.js](https://nextjs.org/)  \n- [Frontend GitHub Project](https://github.com/ruizguille/tech-trends-chatbot/tree/master/frontend)  \n\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Frabbicse%2Fllm","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Frabbicse%2Fllm","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Frabbicse%2Fllm/lists"}