Using any LLM on Open-WebUI
I self-hosted Open-WebUI to use the Llama models I had deployed through Ollama.
I also wanted to try the models hosted on Groq (Llama, DeepSeek, and others), as well as Anthropic and OpenAI models, through the same interface, and Open-WebUI has good support for configuring all of these providers. With that set up, I now have access to all the major LLMs from a single UI.
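Open-WebUI handles this by letting you register additional OpenAI-compatible connections (a base URL plus an API key) alongside the native Ollama connection, and Groq and OpenAI both expose OpenAI-compatible endpoints. Before adding a connection in the UI, it can be handy to confirm that the key and base URL actually work. The sketch below is a minimal example, assuming the `openai` Python package is installed and a Groq key is available in the `GROQ_API_KEY` environment variable; it points the client at Groq's OpenAI-compatible endpoint and sends a single chat request.

```python
# Minimal sketch: verify a provider key/base URL pair before adding it as a
# connection in Open-WebUI. Groq exposes an OpenAI-compatible API, so the
# standard openai client works against it unchanged.
# Assumes `pip install openai` and GROQ_API_KEY set in the environment.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",  # Groq's OpenAI-compatible endpoint
    api_key=os.environ["GROQ_API_KEY"],
)

response = client.chat.completions.create(
    model="llama-3.3-70b-versatile",  # one of the Groq-hosted models listed below
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```

Swapping the base URL for `https://api.openai.com/v1` and the key for an OpenAI key exercises the OpenAI connection the same way. Anthropic's native API is not served this way, so those models are usually added through a separate integration (the `anthropic/` prefix in the list below suggests a gateway or pipe function rather than a direct connection).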

(Screenshot: Open-WebUI models preview)
Below are the models that are now available through this setup (a snippet at the end of the list shows how to pull a provider's current catalogue).
OpenAI
- gpt-4o-mini-audio-preview-2024-12-17
- gpt-4o-2024-11-20
- gpt-4o-audio-preview-2024-10-01
- gpt-4o-audio-preview
- gpt-4o-mini-realtime-preview-2024-12-17
- gpt-4o-mini-2024-07-18
- gpt-4o-mini
- gpt-4o-mini-realtime-preview
- gpt-4o-realtime-preview-2024-10-01
- gpt-4o-audio-preview-2024-12-17
- gpt-4o-2024-08-06
- gpt-4o
- gpt-4o-realtime-preview-2024-12-17
- gpt-4o-realtime-preview
- gpt-4o-2024-05-13
- chatgpt-4o-latest
- gpt-4-turbo
- gpt-4-turbo-2024-04-09
- gpt-4-turbo-preview
- gpt-4-0125-preview
- gpt-4-1106-preview
- gpt-4-0613
- gpt-4
- gpt-3.5-turbo-1106
- gpt-3.5-turbo-instruct
- gpt-3.5-turbo-instruct-0914
- gpt-3.5-turbo-0125
- gpt-3.5-turbo
- gpt-3.5-turbo-16k-0613
- gpt-3.5-turbo-16k
- omni-moderation-2024-09-26
- omni-moderation-latest
OpenAI o-series
- o3-mini-2025-01-31
- o1-mini-2024-09-12
- o1-preview-2024-09-12
- o1-mini
- o3-mini
- o1-preview
- o1
- o1-2024-12-17
DeepSeek
- deepseek-r1-distill-llama-70b
- deepseek-r1-distill-qwen-32b
Anthropic
- anthropic/claude-3-haiku
- anthropic/claude-3-opus
- anthropic/claude-3-sonnet
- anthropic/claude-3.5-haiku
- anthropic/claude-3.5-sonnet
Meta (Llama)
- llama-guard-3-8b
- llama3-8b-8192
- llama-3.1-8b-instant
- llama-3.2-3b-preview
- llama3-70b-8192
- llama-3.3-70b-versatile
- llama-3.3-70b-specdec
- llama-3.2-11b-vision-preview
- llama-3.2-1b-preview
- llama-3.2-90b-vision-preview
Alibaba (Qwen)
- qwen-2.5-32b
- qwen-2.5-coder-32b
Google (Gemma)
- gemma2-9b-it
Mistral
- mixtral-8x7b-32768
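
These catalogues change often, so the lists above are only a snapshot. To see what a provider currently exposes, you can query its OpenAI-compatible `/models` endpoint, which is also how a frontend like Open-WebUI discovers models for its picker. The sketch below reuses the Groq connection from the earlier example (same assumptions: `openai` package installed, key in `GROQ_API_KEY`) and simply prints every model ID the endpoint reports.

```python
# Minimal sketch: list the models a provider currently exposes via its
# OpenAI-compatible /models endpoint. Swap the base URL and key to
# enumerate OpenAI's catalogue instead of Groq's.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",
    api_key=os.environ["GROQ_API_KEY"],
)

for model in client.models.list():
    print(model.id)
```

Running this against each configured base URL reproduces per-provider lists like the ones above.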