
Your OpenClaw bot on ClawCloud comes with the full model catalog for whatever AI provider you deployed with. This guide covers how to switch between models in a live conversation using OpenClaw's /model command.
What you'll need
- A running OpenClaw bot on ClawCloud (any plan)
- Access to your bot on Telegram or Discord
Available models
Which models are available depends on how your bot was deployed:
| Setup | Models available |
|---|---|
| Managed (default) | 94 curated models from all providers via OpenRouter |
| Anthropic (BYOK) | 7 curated (Claude Haiku 4.5 through Claude Opus 4.6) |
| OpenAI (BYOK) | 21 curated (GPT-4.1 through GPT-5.2, plus o-series reasoning models) |
| Google (BYOK) | 5 curated (Gemini 2.5 Flash through Gemini 3 Pro Preview) |
In BYOK mode, you can only use models from the provider whose API key you provided. In managed mode, all 94 curated models are available through a single OpenRouter key, spanning Anthropic, OpenAI, Google, DeepSeek, Mistral, Meta, xAI, Qwen, and more.
Step 1: Check your current model
Send /model to your bot (no arguments). It will tell you which model is currently active.

Step 2: Switch using an alias
The fastest way to switch is with a short alias. Send /model followed by the alias name:
/model haiku
/model opus
/model sonnet
The bot confirms the switch and uses the new model for every message that follows.

Here are all the available aliases:
| Alias | Resolves to |
|---|---|
| opus | anthropic/claude-opus-4-6 |
| sonnet | anthropic/claude-sonnet-4-5 |
| haiku | anthropic/claude-haiku-4-5 |
| gpt | openai/gpt-5.2 |
| gpt-mini | openai/gpt-5-mini |
| gemini | google/gemini-3-pro-preview |
| gemini-flash | google/gemini-3-flash-preview |
In BYOK mode, aliases only work within your provider. If your bot uses an Anthropic key, the gpt and gemini aliases won't work. In managed mode (the default), all aliases work since OpenRouter routes to every provider.
Step 3: Switch using a full model name
For models without aliases, use the full name:
/model openai/o4-mini
/model anthropic/claude-sonnet-4-5
/model google/gemini-2.5-flash
The format is always provider/model-name.
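Splitting a full name into its two parts can be sketched as a one-line parse; the function below is illustrative, not part of OpenClaw:

```python
def parse_model_name(name: str) -> tuple[str, str]:
    """Split 'provider/model-name' into its two parts; raise if malformed."""
    provider, sep, model = name.partition("/")
    if not sep or not provider or not model:
        raise ValueError(f"expected provider/model-name, got {name!r}")
    return provider, model

parse_model_name("openai/o4-mini")  # → ("openai", "o4-mini")
```

Note that `partition` splits on the first `/` only, so model names containing dots or dashes (like `gemini-2.5-flash`) pass through untouched.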

When to use which model
There's no single best model. It depends on what you're asking.
Quick factual questions, translations, simple tasks: Use a smaller model like Haiku, GPT-mini, or Gemini Flash. Faster responses, lower API cost.
Writing, analysis, complex reasoning: Use Opus, GPT-5-pro, or Gemini Pro. Slower and more expensive, but the output quality is noticeably better for harder tasks.
Code generation and debugging: The Codex models (OpenAI) or Sonnet (Anthropic) are good middle ground options.
See API Costs Explained for per-model pricing.
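The guidance above can be summarized as a small routing table. The task categories and picks mirror this guide, but the mapping itself is illustrative; OpenClaw does not route automatically, and you still switch manually with /model:

```python
# Illustrative mapping from task type to the aliases recommended above.
# Nothing here is an OpenClaw feature -- it just restates the guidance.
RECOMMENDED = {
    "quick-question": ["haiku", "gpt-mini", "gemini-flash"],
    "deep-reasoning": ["opus", "gpt", "gemini"],
    "coding": ["sonnet"],
}

def suggest(task: str) -> list[str]:
    """Return suggested aliases, defaulting to a middle-ground model."""
    return RECOMMENDED.get(task, ["sonnet"])
```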
Troubleshooting
"/model" returns an error or "command not found": Your bot may need a reboot. Go to your ClawCloud dashboard and click Reboot on your instance.
Alias doesn't work: Make sure you're using an alias that matches your provider. Anthropic bots only recognize opus, sonnet, and haiku. OpenAI bots only recognize gpt and gpt-mini. Google bots only recognize gemini and gemini-flash.
Model switch doesn't seem to change anything: The switch applies to new messages only. Previous messages in the conversation were already generated by the old model.