selfhostedworld.com

Best open-source alternatives to ChatGPT

OpenAI's conversational AI assistant.

ChatGPT is the widely-adopted large language model interface from OpenAI, used for writing, coding assistance, summarization, and Q&A. Organizations seek self-hosted alternatives to keep sensitive data off third-party servers, to use open-weight models, or to run inference on their own hardware for cost or compliance reasons.

14 alternatives listed
  1. Open WebUI
    132.6k
    MIT License · Open Core

    Open WebUI is a self-hosted interface for working with large language models and related AI workflows. It is aimed at users and teams that want a browser-based AI platform they can run on their own infrastructure, with support for local models through Ollama as well as OpenAI-compatible endpoints.

    Open Core · Cloud Optional · Offline · Multi-User · Package · Docker · Kubernetes · Helm
    Install: package-manager, docker, kubernetes, helm

    Features:

    • offline operation
    • Ollama/OpenAI API integration
    • granular permissions and user groups
    • responsive design
    • PWA mobile offline access

    +5 more

    Auth: ldap, oidc-sso, oauth, proxy-auth
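
    Open WebUI's local-model support sits on top of Ollama's HTTP API. The sketch below shows the kind of call it makes under the hood, using only the standard library; the model name and default port are assumptions (the model must already be pulled, e.g. with `ollama pull llama3`):

    ```python
    import json
    import urllib.request

    OLLAMA_HOST = "http://localhost:11434"  # Ollama's default port

    def build_generate_payload(prompt, model="llama3"):
        """Build the JSON body for Ollama's /api/generate endpoint.

        stream=False requests one complete JSON response instead of
        a stream of chunks.
        """
        return {"model": model, "prompt": prompt, "stream": False}

    def ask_ollama(prompt, model="llama3"):
        """POST a prompt to a local Ollama server and return the generated text."""
        body = json.dumps(build_generate_payload(prompt, model)).encode()
        req = urllib.request.Request(
            f"{OLLAMA_HOST}/api/generate",
            data=body,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]
    ```

    Any frontend in this list that advertises "Ollama integration" is ultimately issuing requests of this shape.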
  2. NextChat
    MIT License · Open Core

    NextChat is an AI assistant application built as a lightweight web app and desktop client. It is aimed at users who want a fast chat interface for working with popular AI models such as Claude, DeepSeek, GPT-4, and Gemini, while also supporting self-deployed model servers like RWKV-Runner and LocalAI. The project emphasizes privacy and convenience. It stores data locally in the browser, supports prompt templates, plugins, realtime chat, and long-conversation compression, and includes a PWA-friendly responsive UI. The README also describes an enterprise edition with centralized administration, permissions, knowledge integration, and private deployment options for organizations.

    Open Core · Cloud Optional · Multi-User · Docker · Kubernetes · Helm
    Install: docker, docker-compose, kubernetes, helm, source

    Features:

    • One-click Vercel deployment
    • Desktop app downloads
    • Self-hosted LLM compatibility
    • Local browser storage
    • Markdown support

    +5 more

    Auth: local
  3. GPT4All
    77.3k
    MIT License · fully-open

    GPT4All is a local large language model application and Python client designed for private use on everyday desktops and laptops. It targets users who want to run LLMs without relying on external API calls or GPUs, and it offers downloadable installers for Windows, macOS, Linux, and a community-maintained Flathub package. The project also includes a Python package that wraps llama.cpp-based model usage, making it usable in scripts and applications. Beyond the desktop app, the README highlights LocalDocs for chatting with personal data, GPU acceleration via Vulkan, an OpenAI-compatible HTTP API server, and integrations with tools such as LangChain and Weaviate.

    Offline · Docker · Package · Binary
    Install: docker, package-manager, binary, flatpak, source

    Features:

    • private local LLM inference
    • desktop chat application
    • LocalDocs for chatting with data
    • Python client
    • OpenAI-compatible HTTP API server

    +5 more
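
    The Python package mentioned above wraps llama.cpp-based inference behind a small API. A minimal sketch, assuming `gpt4all` is installed via pip; the model filename is an illustrative example and may not match the current model catalog:

    ```python
    def local_completion(prompt, model_name="orca-mini-3b-gguf2-q4_0.gguf", max_tokens=200):
        """Generate text with a GPT4All model entirely on-device.

        Requires `pip install gpt4all`; the import is kept inside the
        function so the sketch reads without the package present. The
        first call downloads the model file to a local cache.
        """
        from gpt4all import GPT4All  # third-party; not part of the stdlib

        model = GPT4All(model_name)
        with model.chat_session():  # preserves multi-turn context across calls
            return model.generate(prompt, max_tokens=max_tokens)
    ```

    No API key or network call is involved at inference time, which is the point of the project.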

  4. AnythingLLM
    MIT License · fully-open

    AnythingLLM is an all-in-one AI application for users who want to build a private ChatGPT-like experience around their own documents and workflows. It is aimed at people who need a self-hostable, configurable assistant that can ingest files, connect to different LLM providers, and support multiple users without a complex setup. The project combines a React frontend, Node.js services, document processing, and vector database management into a single monorepo. It includes built-in agents, document pipelines, chat with citations, and support for many local and cloud models, making it suitable for both personal use and production deployments.

    Cloud Optional · Offline · Multi-User · Docker · Binary
    Install: docker, docker-compose, binary, source

    Features:

    • chat with docs
    • AI agents
    • multi-user support
    • permissioning
    • document ingestion

    +5 more

  5. Flowise
    52.0k

    Flowise is a visual application for creating AI agents and related workflows. It is aimed at developers and teams that want to assemble agent logic through a user interface rather than only through code. The project can be run locally with Node.js, installed globally via npm, or deployed with Docker and Docker Compose. The repository contains separate server, UI, and component modules, and it also includes auto-generated API documentation. The README points to cloud and self-hosting options, along with environment-based configuration for operating an instance.

    Cloud Optional · Multi-User · Docker · Package
    Install: docker, docker-compose, package-manager, source

    Features:

    • visual AI agent building
    • local web app
    • Docker deployment
    • self-hosted deployment
    • multi-module architecture

    +2 more

  6. Text Generation Web UI

    Text Generation Web UI is a browser-based interface for running large language models locally. It is designed for users who want an offline, private environment for interacting with models, whether for chat, instruction-following, multimodal prompts, or free-form text generation. The project targets both hobbyists and advanced users who need flexible model execution and experimentation. It supports several inference backends, provides an OpenAI/Anthropic-compatible API, and includes tools for vision, file uploads, training LoRAs, and image generation. The README also emphasizes extensibility through built-in and community extensions, plus multiple installation paths ranging from portable builds to Docker and source-based setup.

    No Telemetry · Offline · Binary · Docker
    Install: binary, source, docker-compose

    Features:

    • local model inference
    • portable builds
    • multiple backends
    • OpenAI/Anthropic-compatible API
    • tool-calling

    +5 more

  7. LocalAI
    45.5k
    MIT License · fully-open

    LocalAI is an open-source AI engine designed for self-hosted deployment. It targets users who want to run modern generative AI workloads locally or inside their own infrastructure, including text, speech, image, and video models, without relying on a GPU. The project presents itself as a drop-in replacement for several popular AI APIs, including OpenAI, Anthropic, and ElevenLabs, while supporting a wide range of backends and hardware accelerators. It also includes built-in agents, a web UI, model management, and features such as RAG, MCP support, embeddings, vision, and realtime audio capabilities. The README emphasizes privacy-first operation, multi-user readiness, and flexible installation via containers, macOS app, source builds, and Kubernetes.

    Offline · Multi-User · Docker · Binary · Kubernetes
    Install: docker, binary, source, kubernetes

    Features:

    • Text generation
    • Text-to-audio
    • Audio-to-text
    • Image generation
    • OpenAI-compatible tools API

    +5 more

    Auth: local
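
    Because LocalAI presents itself as a drop-in replacement for the OpenAI API, any OpenAI-style client can simply be pointed at it. A stdlib-only sketch of the chat-completions call; the port (LocalAI's default is 8080) and model name are assumptions, and the model must already be installed in your instance:

    ```python
    import json
    import urllib.request

    def build_chat_request(prompt, model, base_url="http://localhost:8080/v1"):
        """Build an OpenAI-style /chat/completions request aimed at LocalAI."""
        body = json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }).encode()
        return urllib.request.Request(
            f"{base_url}/chat/completions",
            data=body,
            headers={"Content-Type": "application/json"},
        )

    def chat(prompt, model="gpt-4"):
        """Send one user message and return the assistant's reply text."""
        with urllib.request.urlopen(build_chat_request(prompt, model)) as resp:
            data = json.loads(resp.read())
        return data["choices"][0]["message"]["content"]
    ```

    The same request shape works against any of the other tools here that advertise an OpenAI-compatible server, which is what makes them interchangeable backends.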
  8. Jan
    41.9k

    Jan is an open-source desktop application positioned as a ChatGPT replacement. It is designed for people who want to download and run large language models locally while keeping control over their data and preserving privacy. The project also connects to cloud-hosted AI providers such as OpenAI, Anthropic, Mistral, Groq, and MiniMax, making it useful for users who want both local and hosted model access in one interface. Jan includes support for custom assistants, an OpenAI-compatible local server for other applications, and Model Context Protocol integration for agent-style workflows. The README presents it as a cross-platform product available for Windows, macOS, and Linux, with installation via direct downloads, Microsoft Store, and Flathub, plus source builds for developers.

    Cloud Optional · Offline · Package · Binary
    Install: package-manager, binary, source

    Features:

    • Local AI models
    • Cloud model integration
    • Custom assistants
    • OpenAI-compatible API
    • Model Context Protocol integration

    +1 more

  9. LibreChat
    35.8k
    MIT License · fully-open

    LibreChat is a self-hosted AI chat platform designed for people and teams that want a single interface for working with many AI providers. It combines conversational chat with support for agents, tools, code execution, multimodal interactions, web search, and image generation, while emphasizing control over deployment and data. The project is aimed at users who want a ChatGPT-like experience without being locked into one provider. It supports multiple authentication methods, conversation management features such as branching and presets, and deployment options ranging from local use to cloud hosting. The README also highlights multilingual support, resumable streaming, and export/import features, making it suitable for personal use as well as enterprise-style self-hosted setups.

    Cloud Optional · Multi-User · Docker
    Install: docker

    Features:

    • ChatGPT-inspired UI
    • AI model selection
    • Custom OpenAI-compatible endpoints
    • Code interpreter
    • AI agents and tools integration

    +5 more

    Auth: oauth, ldap, local
  10. Khoj
    34.1k

    Khoj is a personal AI application designed to extend a user’s capabilities as a "second brain." It combines conversational AI, document retrieval, semantic search, and agent creation so people can work with their own notes, files, and online information in one place. The project is aimed at individual users as well as teams and enterprises, with support for self-hosting, a cloud app, and enterprise deployment models. It can be used from multiple clients such as browsers, note-taking apps, desktop, mobile, and chat platforms, and it supports both local and online LLMs. The README emphasizes privacy and flexibility, highlighting that it can run on a personal computer or be accessed through the hosted service.

    Open Core · Cloud Optional · Offline · Multi-User · Multi-Tenant
    Install: source

    Features:

    • Chat with local or online LLMs
    • Search personal documents and the internet
    • Use across browser, Obsidian, Emacs, desktop, phone, and WhatsApp
    • Create custom agents
    • Automate research and notifications

    +3 more

What to look for in a ChatGPT alternative

Evaluate which model backends the alternative supports (Ollama, llama.cpp, OpenAI-compatible APIs) and whether it can switch between local and remote models. Look for conversation history management, multi-model routing, and RAG support if you need document Q&A. GPU requirements and inference speed matter significantly for local deployments.
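
Since most of the tools above speak the OpenAI wire format, switching between a local and a hosted model often reduces to swapping a base URL and key. A hedged sketch of that pattern; the backend names and URLs in the table are illustrative assumptions, not defaults documented by any particular project:

```python
import os

# Illustrative endpoint table; adjust to match your own deployments.
BACKENDS = {
    "local-ollama":  {"base_url": "http://localhost:11434/v1", "key_env": None},
    "local-localai": {"base_url": "http://localhost:8080/v1",  "key_env": None},
    "openai":        {"base_url": "https://api.openai.com/v1", "key_env": "OPENAI_API_KEY"},
}

def resolve_backend(name):
    """Return (base_url, api_key) for a configured backend.

    Local servers typically accept any placeholder key; hosted providers
    need a real one, read here from the environment.
    """
    cfg = BACKENDS[name]
    if cfg["key_env"] is None:
        return cfg["base_url"], "not-needed"
    key = os.environ.get(cfg["key_env"])
    if not key:
        raise RuntimeError(f"set {cfg['key_env']} to use backend {name!r}")
    return cfg["base_url"], key
```

A frontend that exposes this kind of indirection makes it easy to develop against a local model and route production traffic to a hosted one without touching application code.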