Recommended models

GPT-5.1-Codex: Optimized for long-running, agentic coding tasks in Codex.

GPT-5.1-Codex-Mini: A smaller, more cost-effective, less capable version of GPT-5.1-Codex.

GPT-5.1: Great for coding and agentic tasks across domains.
Configuring models
Configure your default local model
Both the Codex CLI and Codex IDE Extension use the same config.toml configuration file to set the default model.
To choose your default model, add a model entry to your config.toml. If no entry is set, your version of the Codex CLI or IDE Extension will pick a default model for you.
model = "gpt-5.1-codex"
If you regularly switch between different models in the Codex CLI and want to control more than just the model setting, you can also create different Codex profiles.
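As a sketch of what profiles can look like, the fragment below defines two hypothetical profiles (the profile names and the model_reasoning_effort key are illustrative; check your Codex CLI version's config reference for the exact supported keys):

```toml
# Default model used when no profile is selected.
model = "gpt-5.1-codex"

# A lighter-weight profile for quick tasks.
[profiles.quick]
model = "gpt-5.1-codex-mini"

# A profile for harder, long-running tasks.
[profiles.deep]
model = "gpt-5.1-codex"
model_reasoning_effort = "high"
```

You would then select a profile at launch, for example with codex --profile quick.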
Temporarily choosing a different local model
In the Codex CLI you can use the /model command during an active session to change the model. In the IDE Extension you can use the model selector next to the input box to choose your model.
To start a new Codex CLI session with a specific model, or to specify the model for codex exec, use the --model/-m flag:
codex -m gpt-5.1-codex-mini
Choosing your model for cloud tasks
There is currently no way to choose the model for Codex Cloud tasks; they run on gpt-5.1-codex.
Legacy models

GPT-5-Codex: Version of GPT-5 tuned for long-running, agentic coding tasks. Succeeded by GPT-5.1-Codex.

GPT-5-Codex-Mini: Smaller, more cost-effective version of GPT-5-Codex. Succeeded by GPT-5.1-Codex-Mini.

GPT-5: Reasoning model for coding and agentic tasks across domains. Succeeded by GPT-5.1.
Other models
Codex works best with the models listed above.
If you’re authenticating Codex with an API key, you can also point Codex at any model and provider that supports either the Chat Completions or Responses APIs to fit your specific use case.
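As an illustration, a custom provider can be declared in config.toml along these lines (the provider id, URL, model name, and environment variable below are placeholders, not real endpoints; consult your Codex CLI version's config reference for the exact key names):

```toml
# Route requests through a hypothetical third-party provider.
model_provider = "example"
model = "example-model-name"

[model_providers.example]
name = "Example Provider"
base_url = "https://api.example.com/v1"   # placeholder endpoint
env_key = "EXAMPLE_API_KEY"               # API key read from this env var
wire_api = "chat"                         # Chat Completions; or "responses"
```

The wire_api setting reflects the two API shapes mentioned above: set it to match whichever of the Chat Completions or Responses APIs your provider supports.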