https://github.com/ineelhere/shiny.ollama
Chat offline with open-source LLMs like deepseek-r1, nemotron, qwen, llama and more all through a simple R package powered by Shiny and Ollama. 🚀
- Host: GitHub
- URL: https://github.com/ineelhere/shiny.ollama
- Owner: ineelhere
- License: apache-2.0
- Created: 2024-12-25T14:47:20.000Z (5 months ago)
- Default Branch: main
- Last Pushed: 2025-02-10T17:41:21.000Z (3 months ago)
- Last Synced: 2025-02-10T18:34:17.756Z (3 months ago)
- Topics: cran, deepseek-r1, llama3, llm, local-llm, offline-first, offline-llm, ollama, ollama-app, ollama-gui, package, r, shiny, shinyapp
- Language: R
- Homepage: https://ineelhere.github.io/shiny.ollama/
- Size: 4.33 MB
- Stars: 6
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Metadata Files:
- Readme: README.md
- License: LICENSE.md
# `shiny.ollama`
[CRAN](https://cran.r-project.org/package=shiny.ollama) · [GitHub](https://github.com/ineelhere/shiny.ollama)

**R Shiny Interface for Chatting with LLMs Offline via Ollama**
*Experience seamless, private, and offline AI conversations right on your machine! `shiny.ollama` provides a user-friendly R Shiny interface to interact with LLMs locally, powered by [Ollama](https://ollama.com).*
[Documentation](https://www.indraneelchakraborty.com/shiny.ollama/)
## ⚠️ Disclaimer
**Important:** `shiny.ollama` requires Ollama to be installed on your system. Without it, this package will not function. Follow the [Installation Guide](#-how-to-install-ollama) below to set up Ollama first.

## Version Information
- **`CRAN` Version (`0.1.1`)**:
  - Core functionality for offline LLM interaction
  - Basic model selection and chat interface
  - Chat history export capabilities
- **`Development` Version (`0.1.2`)**:
  - All features from `0.1.1`
  - Better UI/UX
  - Advanced parameter customization
  - Enhanced user control over model behavior

## Installation
### From CRAN (Stable Version - `0.1.1`)
```r
install.packages("shiny.ollama")
```

### From GitHub (Latest Development Version - `0.1.2`)
```r
# Install devtools if not already installed
install.packages("devtools")

devtools::install_github("ineelhere/shiny.ollama")
```

## Quick Start
Launch the Shiny app in R with:
```r
library(shiny.ollama)

# Start the application
shiny.ollama::run_app()
```

## Features
### Core Features (All Versions)
- **Fully Offline**: No internet required – complete privacy
- **Model Selection**: Easily choose from available LLM models
- **Message Input**: Engage in conversations with AI
- **Save & Download Chats**: Export your chat history
- **User-Friendly Interface**: Powered by R Shiny

### Advanced Features (Development Version `0.1.2`)
Customize your LLM interaction with adjustable parameters:
- Temperature control
- Context window size
- Top K sampling
- Top P sampling
- System instructions customization

## How to Install Ollama
To use this package, install Ollama first:

1. Download Ollama from [here](https://ollama.com) (Mac, Windows, Linux supported)
2. Install it by following the provided instructions
3. Verify your installation:
```sh
ollama --version
```
If successful, the version number will be displayed.

4. Pull a model (e.g., [deepseek-r1](https://ollama.com/library/deepseek-r1)) to get started
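Once a model is pulled (for example with `ollama pull deepseek-r1`), the advanced parameters listed earlier (temperature, context window size, Top K, Top P, system instructions) correspond to `options` fields in Ollama's HTTP API, which listens on `http://localhost:11434` by default. As an illustration of what such a request looks like (this is not `shiny.ollama`'s internal code; the model name and parameter values are examples only), the request body can be built in R with `jsonlite`:

```r
library(jsonlite)

# Example request body for Ollama's /api/chat endpoint.
request_body <- list(
  model = "deepseek-r1",
  messages = list(
    # System instructions customization
    list(role = "system", content = "You are a concise assistant."),
    list(role = "user", content = "Hello!")
  ),
  options = list(
    temperature = 0.7,  # sampling temperature
    num_ctx = 4096,     # context window size
    top_k = 40,         # Top K sampling
    top_p = 0.9         # Top P sampling
  ),
  stream = FALSE
)

# Inspect the JSON that would be sent:
cat(toJSON(request_body, auto_unbox = TRUE, pretty = TRUE))

# It could be POSTed to a locally running Ollama, e.g. with httr2:
# resp <- httr2::request("http://localhost:11434/api/chat") |>
#   httr2::req_body_json(request_body) |>
#   httr2::req_perform()
```

The model's reply comes back as JSON; `shiny.ollama` wraps this kind of local interaction behind its Shiny UI, so no request needs to be constructed by hand.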
## License and Declaration
This R package is an independent, passion-driven open source initiative, released under the **`Apache License 2.0`**. It is not affiliated with, owned by, funded by, or influenced by any external organization. The project is dedicated to fostering a community of developers who share a love for coding and collaborative innovation.

Contributions, feedback, and feature requests are always welcome!
Stay tuned for more updates. 🚀