
Getting Started



Your private, open-source, AI assistant platform.

An open-source alternative to ChatGPT, Claude & Gemini. Run LLMs from OpenAI, Anthropic, Mistral, Meta's Llama, and Google Gemini, or run open models locally via Ollama, all together on your own private infrastructure.

Out-of-the-box solutions that you can tune to empower your team with AI and accelerate your team's AI adoption.

Get started

Using Docker, you can get started by entering the following command in your terminal.

docker run --name promptpanel \
  -p 4000:4000 \
  -v PROMPT_DB:/app/database \
  -v PROMPT_MEDIA:/app/media \
  --pull=always \
  promptpanel/promptpanel:latest
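If you prefer Docker Compose, the same setup can be sketched as a Compose file. This is a convenience translation of the command above, not an officially provided file; the volume names `PROMPT_DB` and `PROMPT_MEDIA` are taken directly from the `docker run` command.

```yaml
# Sketch: Compose equivalent of the `docker run` command above.
services:
  promptpanel:
    image: promptpanel/promptpanel:latest
    container_name: promptpanel
    pull_policy: always        # mirrors --pull=always
    ports:
      - "4000:4000"            # mirrors -p 4000:4000
    volumes:
      - PROMPT_DB:/app/database
      - PROMPT_MEDIA:/app/media
volumes:
  PROMPT_DB:
  PROMPT_MEDIA:
```

Once the container is running, the interface is reachable on port 4000 of the host (per the port mapping above), e.g. http://localhost:4000.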


  • Run any large language model, with any inference provider, any way you want: from commercial models like OpenAI, Anthropic, Gemini, or Cohere to open-source models, either hosted or running locally via Ollama.
  • Access controls to assign users to agents without revealing your API tokens or credentials. Enable user sign-up and login with OpenID Connect (OIDC) single sign-on.
  • Bring your own data and store it locally on your instance. Use it safely by pairing it with any language model, whether online or offline.
  • Create custom agent plugins in Python to extend your agents' capabilities and build retrieval-augmented generation (RAG) pipelines.
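As a generic illustration of the retrieval step a RAG pipeline performs (this is not PromptPanel's plugin API, just a minimal sketch of the technique), a pipeline embeds stored documents and the user's query, ranks documents by similarity, and feeds the top matches to the language model as context. The toy bag-of-words "embedding" below stands in for a real embedding model:

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding": term frequencies over lowercase tokens.
    # A real pipeline would call a learned embedding model here.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, k=2):
    # Rank documents by similarity to the query; the top-k would be
    # inserted into the model prompt as context.
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "PromptPanel runs on your own infrastructure.",
    "Bananas are rich in potassium.",
    "Agents can use commercial or local language models.",
]
print(retrieve("which language models can agents use?", docs, k=1))
```

The retrieval step is model-agnostic, which is what lets the same local data be paired with any language model, online or offline.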