Open WebUI App
Feature-rich, self-hosted WebUI for Ollama and OpenAI-compatible APIs. ChatGPT-like experience with your local models.
About
Open WebUI (formerly Ollama WebUI) is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline. It supports various LLM runners, including Ollama and OpenAI-compatible APIs.
Features
- 💬 ChatGPT-like Interface: Familiar, intuitive chat experience
- 🔒 Fully Offline: Complete privacy with local processing
- 🎨 Customizable: Themes, settings, and extensions
- 📝 Markdown Support: Rich text formatting with syntax highlighting
- 💾 Conversation History: Save and organize your chats
- 🔄 Model Switching: Switch between models on the fly
- 📱 Responsive Design: Works on desktop and mobile
- 🔌 Extensible: Support for pipelines and plugins
Prerequisites
This app works best with the Ollama app installed and running. Install Ollama first to run local LLM models.
Installation
- Install and start the Ollama app (recommended)
- Add the J0rsa repository to your Home Assistant
- Search for "Open WebUI" in the App Store (formerly Add-on Store)
- Click Install and wait for the download to complete
- Configure the Ollama API URL if needed
- Start the app
Usage
Accessing the Web Interface
After starting the app, access Open WebUI at:
- Direct Access:
http://homeassistant.local:5000
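If the page does not load, you can confirm the service is answering from another machine on your network. A quick sanity check (adjust the hostname if you access Home Assistant by IP address):

# Expect an HTTP status line such as "HTTP/1.1 200 OK" if Open WebUI is up
curl -I http://homeassistant.local:5000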
First-Time Setup
- Open the web interface
- Create an admin account
- Configure your LLM backend (Ollama URL)
- Start chatting!
Connecting to Ollama
By default, Open WebUI connects to Ollama at http://localhost:11434. If your Ollama app is running on the same Home Assistant instance, this should work automatically.
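You can verify this yourself by querying the Ollama API directly. A minimal check, assuming shell access to the Home Assistant host (for example via the SSH add-on):

# Expect a small JSON reply reporting the Ollama version if the API is reachable
curl http://localhost:11434/api/version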
Configuration
Ollama Connection
If Ollama is on a different host:
- Open the Open WebUI settings
- Go to "Connections"
- Update the Ollama API URL
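The URL should contain only the scheme, host, and port, with no trailing path. For example, for Ollama running on a hypothetical LAN host at 192.168.1.50:

# Example Ollama API URL for a remote host (replace with your own address)
http://192.168.1.50:11434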
User Management
Open WebUI supports multiple users:
- Admin: Full control over settings and users
- Users: Can chat and manage their own conversations
Tips
- Model Selection: Use the dropdown to switch between installed Ollama models
- System Prompts: Customize AI behavior with system prompts (see the example after this list)
- Conversation Export: Export chats for backup or sharing
- Dark Mode: Toggle dark mode in settings
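System prompts are set in the Open WebUI interface, but you can preview their effect against the Ollama API directly. A minimal sketch, assuming the mistral model is installed and the API is reachable on the default port:

# Ask a question with a custom system prompt (non-streaming response)
curl http://homeassistant.local:11434/api/generate -d '{
  "model": "mistral",
  "system": "You are a terse assistant. Answer in one sentence.",
  "prompt": "What is Home Assistant?",
  "stream": false
}'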
Integration with Ollama
Ensure Ollama is running and has models downloaded:
# Check available models (via Ollama API)
curl http://homeassistant.local:11434/api/tags
Recommended Models
- llama2: General-purpose conversations
- mistral: Fast and capable
- codellama: Code assistance
- neural-chat: Conversational AI
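If a model from this list is missing from the output of /api/tags, you can pull it through the Ollama API. This assumes the Ollama app exposes its API on the default port; progress is streamed back as JSON:

# Download the mistral model to the Ollama app
curl http://homeassistant.local:11434/api/pull -d '{"name": "mistral"}'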
Troubleshooting
Cannot Connect to Ollama
- Verify Ollama app is running
- Check Ollama API URL in settings
- Ensure both apps can communicate
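A quick way to triage a connection problem is to test both endpoints from the same machine. If the first command succeeds but Open WebUI still reports no connection, the API URL configured under "Connections" is the likely culprit:

# Test the Ollama API, then the Open WebUI frontend
curl -s http://homeassistant.local:11434/api/tags && echo "Ollama OK"
curl -sI http://homeassistant.local:5000 | head -n 1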
Slow Responses
- Larger models require more resources
- Try smaller models (7B instead of 13B)
- Check system RAM usage
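To see whether memory is the bottleneck, check usage while a model is loaded. This assumes shell access to the host (for example via the SSH add-on):

# Show total, used, and available RAM in human-readable units
free -h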
Web Interface Not Loading
- Check that port 5000 is accessible
- Verify the app is running
- Check app logs for errors
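Logs are available in the Home Assistant UI on the app's Log tab, or from the Home Assistant CLI. The add-on slug below is a placeholder; substitute the slug shown on the app's info page:

# Tail the app's logs (replace the placeholder with your installed slug)
ha addons logs <addon_slug>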
Support
| ← Back to Apps | View on GitHub |