# R Shiny Interface for Chatting with LLMs Offline via Ollama
Experience seamless, private, and offline AI conversations right on your machine! shiny.ollama provides a user-friendly R Shiny interface to interact with LLMs locally, powered by Ollama.
## ⚠️ Disclaimer
**Important:** `shiny.ollama` requires Ollama to be installed on your system. Without it, this package will not function. Follow the Installation Guide below to set up Ollama first.
## Version Information

- **CRAN Version (0.1.1):**
  - Core functionality for offline LLM interaction
  - Basic model selection and chat interface
  - Chat history export capabilities
- **Development Version (0.1.2):**
  - All features from 0.1.1
  - Better UI/UX
  - Advanced parameter customization (see the sketch after this list)
  - Enhanced user control over model behavior
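
To illustrate what parameter customization enables, the sketch below sends a single generation request straight to Ollama's local REST API with a custom temperature. This is an illustrative assumption, not the package's internal code: it presumes Ollama's default server address (`http://localhost:11434`), an already-pulled `deepseek-r1` model, and the `httr2` package.

```r
library(httr2)

# Illustrative only: query Ollama's local REST API directly with a custom
# sampling temperature. Assumes the default server address and that the
# "deepseek-r1" model has already been pulled.
resp <- request("http://localhost:11434/api/generate") |>
  req_body_json(list(
    model   = "deepseek-r1",
    prompt  = "Summarize what Ollama does in one sentence.",
    stream  = FALSE,
    options = list(temperature = 0.7)  # the kind of knob the app exposes
  )) |>
  req_perform()

# Extract the model's reply from the JSON response
resp_body_json(resp)$response
```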
## Installation

### From CRAN (Stable Version - 0.1.1)

```r
install.packages("shiny.ollama")
```

### From GitHub (Latest Development Version - 0.1.2)

```r
# Install devtools if not already installed
install.packages("devtools")

devtools::install_github("ineelhere/shiny.ollama")
```

## Quick Start
Launch the Shiny app in R with:
```r
library(shiny.ollama)

# Start the application
shiny.ollama::run_app()
```
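
Because the app depends on a running Ollama server, a quick pre-flight check can help. The base-R sketch below is a hypothetical convenience, not part of the package API; it assumes Ollama's default address of `http://localhost:11434`.

```r
# A minimal pre-flight sketch (not part of the package API): check that the
# local Ollama server is reachable on its default port before launching.
ollama_up <- tryCatch({
  con <- url("http://localhost:11434", open = "r")
  close(con)
  TRUE
}, error = function(e) FALSE)

if (ollama_up) {
  shiny.ollama::run_app()
} else {
  message("Ollama server not reachable; start it (e.g., `ollama serve`) and retry.")
}
```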
## How to Install Ollama

To use this package, install Ollama first:

- Download Ollama from [ollama.com](https://ollama.com) (Mac, Windows, and Linux are supported)
- Install it by following the provided instructions
- Verify your installation by running `ollama --version` in a terminal; if successful, the version number will be displayed
- Pull a model (e.g., `ollama pull deepseek-r1`) to get started
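
As an optional sanity check (a sketch, not part of the package), you can confirm from R that your pulled models are visible by querying Ollama's `/api/tags` endpoint, which lists locally available models; presumably this is the same set the app's model selector shows. This again assumes the default server address and the `httr2` package.

```r
library(httr2)

# Illustrative sketch: list locally available models via Ollama's
# /api/tags endpoint (default server address assumed).
tags <- request("http://localhost:11434/api/tags") |>
  req_perform() |>
  resp_body_json()

# Print the names of all locally pulled models
vapply(tags$models, function(m) m$name, character(1))
```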
## License and Declaration
This R package is an independent, passion-driven open source initiative, released under the Apache License 2.0. It is not affiliated with, owned by, funded by, or influenced by any external organization. The project is dedicated to fostering a community of developers who share a love for coding and collaborative innovation.
Contributions, feedback, and feature requests are always welcome!
Stay tuned for more updates. 🚀