Tech stack: Ollama, TypeScript, React, Vite, Electron
Description
Ollama Chat App is a desktop Electron app that lets you run and interact with local Ollama AI models in a clean, intuitive chat interface. Users can download models directly within the app, with clear progress indicators for both overall download status and individual model layers. Once a model is selected, the chat interface streams responses in real time and renders them in Markdown, enabling rich formatting and improved readability. The interface is designed to keep interactions simple and focused, letting users manage multiple models and switch between them easily.
Future development aims to extend the app with features such as saving and loading past chats, speech-to-text and text-to-speech for more natural conversations, and media insertion, including enabling models to generate and display images to further enrich the interactive experience.
GitHub README
Ollama Chat App
A desktop Electron application for interacting with local Ollama AI models through a clean, responsive chat interface.
Overview
Ollama Chat App is a desktop application designed to make working with local large language models simple and accessible. It provides a focused chat-based interface for running Ollama models locally, removing the friction of command-line interaction while keeping everything on-device.
The application is built to feel lightweight and intuitive, prioritising fast feedback, clarity, and ease of model management.
Model Management
Users can browse and download Ollama models directly within the app. Model downloads display clear progress indicators, including both overall progress and per-layer status, making it easy to track large model downloads.
Once downloaded, models can be selected instantly, allowing users to switch between different models without restarting the application.
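Under the hood, Ollama's pull endpoint streams newline-delimited JSON status objects, each identifying a model layer by digest along with completed/total byte counts. A minimal sketch of folding those lines into per-layer state and an overall percentage (the types and helper names here are illustrative, not the app's actual code):

```typescript
// Shape of one progress line streamed by Ollama's POST /api/pull.
// Byte counts are optional because status-only lines (e.g. "verifying
// sha256 digest") carry no totals.
interface PullProgress {
  status: string;
  digest?: string;    // identifies the model layer being downloaded
  total?: number;     // total bytes for this layer
  completed?: number; // bytes downloaded so far
}

// Fold a raw NDJSON line into per-layer state keyed by digest and
// return the overall download percentage across all known layers.
function updateProgress(
  layers: Map<string, { completed: number; total: number }>,
  line: string,
): number {
  const chunk: PullProgress = JSON.parse(line);
  if (chunk.digest && chunk.total) {
    layers.set(chunk.digest, {
      completed: chunk.completed ?? 0,
      total: chunk.total,
    });
  }
  let completed = 0;
  let total = 0;
  for (const layer of layers.values()) {
    completed += layer.completed;
    total += layer.total;
  }
  return total === 0 ? 0 : Math.round((completed / total) * 100);
}
```

Feeding every streamed line through a reducer like this gives both the per-layer map (for individual progress bars) and the aggregate figure from one pass over the stream.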
Chat Interface
The chat interface streams model responses in real time, providing immediate feedback as tokens are generated. Messages are rendered in Markdown, enabling rich formatting such as code blocks, lists, and emphasis for improved readability.
The interface is intentionally minimal, keeping the focus on the conversation while supporting extended interactions with local models.
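Ollama's chat endpoint, when called with `stream: true`, returns newline-delimited JSON chunks, each carrying a partial `message.content` and a `done` flag. A hedged sketch of accumulating those chunks into the text the UI re-renders as Markdown (names here are illustrative):

```typescript
// One streamed chunk from Ollama's POST /api/chat (stream: true).
interface ChatChunk {
  message?: { role: string; content: string };
  done: boolean;
}

// Fold one raw NDJSON line into the assistant reply accumulated so far.
// Returning both the updated text and the done flag lets the UI
// re-render the Markdown view after each chunk and stop on completion.
function accumulateChunk(
  text: string,
  line: string,
): { text: string; done: boolean } {
  const chunk: ChatChunk = JSON.parse(line);
  return {
    text: text + (chunk.message?.content ?? ""),
    done: chunk.done,
  };
}
```

Keeping the reducer pure like this makes the streaming path easy to unit-test without a running model server.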
Architecture and Stack
The app is built using Electron for cross-platform desktop support, with a modern frontend stack powered by React, Vite, and TypeScript. This combination allows for rapid development, strong type safety, and a smooth desktop user experience.
Ollama runs locally and handles all model inference, ensuring prompts and responses never leave the user’s machine.
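Because Ollama's HTTP API listens on localhost (port 11434 by default), every request the app builds targets the loopback address, which is what keeps prompts and responses on-device. A minimal sketch of constructing such a request, with the URL constant and function name as illustrative assumptions:

```typescript
// Ollama's default local endpoint; traffic never leaves the machine.
const OLLAMA_URL = "http://localhost:11434";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build a request against the local chat endpoint. Setting stream: true
// asks Ollama to return newline-delimited JSON chunks rather than one
// final response, enabling real-time display in the UI.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    url: `${OLLAMA_URL}/api/chat`,
    body: JSON.stringify({ model, messages, stream: true }),
  };
}
```

The returned object can be handed to `fetch` (or proxied through Electron's main process) without any remote hosts ever being involved.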
Planned Features
Future development will expand the app's capabilities, including:
- Saving and loading past chat sessions
- Speech-to-text and text-to-speech for more natural interaction
- Media insertion and display
- Support for image generation by compatible models
Tech Stack
- TypeScript
- React
- Vite
- Electron
- Ollama
Outcome
This project demonstrates how local AI models can be integrated into a polished desktop application. It highlights experience with Electron, real-time streaming interfaces, and practical tooling around local-first AI workflows.