Open WebUI

AI chat & assistants

A self-hosted AI chat UI compatible with Ollama and OpenAI-compatible APIs. Capable of running fully offline.

Rating: 4.4
Web · Self-hosted (Docker)

What is Open WebUI?

Open WebUI is an open-source, self-hosted AI chat platform. It works with a wide range of LLM runners, including Ollama and OpenAI-compatible APIs, and can operate completely offline. Advanced features include RAG (Retrieval-Augmented Generation), voice and video calls, document processing, and Python tool calling. It supports 9 vector databases, including ChromaDB, PostgreSQL, Qdrant, and Milvus, and offers prompt injection protection via LLM-Guard. It is one of the most popular self-hosted AI tools of 2026 for individuals and organizations that prioritize privacy.
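Since the platform is typically deployed with Docker, a minimal sketch of a single-container launch (based on the project's Docker quick start; exact flags and the image tag may change between releases, so check the official docs before running):

```shell
# Run Open WebUI on http://localhost:3000, persisting data in a named volume.
# --add-host lets the container reach an Ollama instance on the host machine.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

The named volume (`open-webui`) keeps chats, accounts, and settings across container upgrades; removing only the container and re-running the command preserves your data.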

[Screenshot: Open WebUI]

Pricing plans

1. Free (open source)

Key features

Multi-model support
RAG
Voice & video calls
Document processing
Prompt injection protection
9 vector database support

Pros and cons

Pros

  • Completely free and open source
  • Works offline
  • Excellent privacy protection

Cons

  • Requires technical knowledge such as Docker for setup
  • Japanese localization of the UI is incomplete
  • Requires your own GPU/server

Frequently asked questions

Q. Is Open WebUI free?

A. Yes, it is completely open source and free. However, you need a server or PC (possibly with a GPU) to run it.

Q. How does Open WebUI differ from Ollama?

A. Ollama is an engine for running LLMs locally; Open WebUI is a frontend (user interface) for it. By connecting Open WebUI to Ollama, you can use local AI through a ChatGPT-like UI.
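To make the division of labor concrete: Ollama serves the models over an OpenAI-compatible HTTP API, and Open WebUI is one client of that API. The sketch below builds the same kind of chat request against a local Ollama instance using only the Python standard library (the model name `llama3` is an assumption; substitute any model you have pulled, and note the network call is commented out so the snippet runs without a live server):

```python
import json
from urllib import request

# Ollama exposes an OpenAI-compatible endpoint under /v1 (default port 11434).
# Open WebUI speaks to this same API; here we build one chat request by hand.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

payload = {
    "model": "llama3",  # assumed model name; use any model pulled via `ollama pull`
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment to send against a running Ollama instance:
# with request.urlopen(req) as resp:
#     reply = json.load(resp)
#     print(reply["choices"][0]["message"]["content"])
```

Because the endpoint follows the OpenAI schema, any OpenAI-compatible client (including Open WebUI itself) can be pointed at it by changing only the base URL.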
