Codex Models

Meet the AI models that power Codex

gpt-5.4

Flagship frontier model for professional work, combining the industry-leading coding capabilities of GPT-5.3-Codex with stronger reasoning, tool use, and agentic workflows.

codex -m gpt-5.4
gpt-5.4-mini

Fast, efficient mini model for responsive coding tasks and subagents.

codex -m gpt-5.4-mini
gpt-5.3-codex

Industry-leading coding model for complex software engineering. Its coding capabilities now also power GPT-5.4.

codex -m gpt-5.3-codex
gpt-5.3-codex-spark

Text-only research preview model optimized for near-instant, real-time coding iteration. Available to ChatGPT Pro users.

codex -m gpt-5.3-codex-spark

For most tasks in Codex, start with gpt-5.4. It combines strong coding, reasoning, native computer use, and broader professional workflows in one model. Use gpt-5.4-mini when you want a faster, lower-cost option for lighter coding tasks or subagents. The gpt-5.3-codex-spark model is available in research preview for ChatGPT Pro subscribers and is optimized for near-instant, real-time coding iteration.

Alternative models

gpt-5.2-codex

Advanced coding model for real-world engineering. Succeeded by GPT-5.3-Codex.

codex -m gpt-5.2-codex
gpt-5.2

Previous general-purpose model for coding and agentic tasks across industries and domains. Succeeded by GPT-5.4.

codex -m gpt-5.2
gpt-5.1-codex-max

Optimized for long-horizon, agentic coding tasks in Codex.

codex -m gpt-5.1-codex-max
gpt-5.1

Great for coding and agentic tasks across domains. Succeeded by GPT-5.2.

codex -m gpt-5.1
gpt-5.1-codex

Optimized for long-running, agentic coding tasks in Codex. Succeeded by GPT-5.1-Codex-Max.

codex -m gpt-5.1-codex
gpt-5-codex

Version of GPT-5 tuned for long-running, agentic coding tasks. Succeeded by GPT-5.1-Codex.

codex -m gpt-5-codex
gpt-5-codex-mini

Smaller, more cost-effective version of GPT-5-Codex. Succeeded by GPT-5.1-Codex-Mini.

codex -m gpt-5-codex-mini
gpt-5

Reasoning model for coding and agentic tasks across domains. Succeeded by GPT-5.1.

codex -m gpt-5

Other models

Codex works best with the models listed above.

You can also point Codex at any model from any provider that supports the Chat Completions or Responses API to fit your specific use case.

Support for the Chat Completions API is deprecated and will be removed in future releases of Codex.
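Custom providers are configured in the same config.toml used for model selection. The sketch below assumes the model_providers table supported by recent Codex CLI releases; the provider id, display name, base URL, and environment variable are placeholders for your own values.

```toml
# Define a custom provider (all values below are placeholders).
[model_providers.myprovider]
name = "My Provider"
base_url = "https://api.example.com/v1"  # endpoint compatible with the Responses API
env_key = "MYPROVIDER_API_KEY"           # API key is read from this environment variable
wire_api = "responses"                   # prefer "responses"; "chat" is deprecated

# Point Codex at that provider and one of its models.
model_provider = "myprovider"
model = "my-model-name"
```

Check the configuration reference for your installed CLI version before relying on these exact keys.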

Configuring models

Configure your default local model

The Codex CLI and IDE extension share the same config.toml configuration file. To specify a model, add a model entry to that file. If you don't specify a model, the Codex app, CLI, and IDE extension default to a recommended model.

model = "gpt-5.4"
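A slightly fuller sketch of the same file, assuming your CLI version also supports the model_reasoning_effort setting (verify against the configuration docs for your release):

```toml
# ~/.codex/config.toml
model = "gpt-5.4"
model_reasoning_effort = "medium"  # assumed setting; typically "low" | "medium" | "high"
```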

Choosing a different local model temporarily

In the Codex CLI, you can use the /model command during an active thread to change the model. In the IDE extension, you can use the model selector below the input box to choose your model.

To start a new Codex CLI thread with a specific model, or to specify the model for codex exec, use the --model/-m flag:

codex -m gpt-5.4
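If you switch between models often, profiles can bundle a model choice with other settings so you don't retype flags. This sketch assumes the profiles table and --profile flag available in recent Codex CLI releases; the profile names are examples:

```toml
# In config.toml: named profiles, each with its own model.
[profiles.daily]
model = "gpt-5.4"

[profiles.quick]
model = "gpt-5.4-mini"
```

You would then start a thread with, for example, codex --profile quick.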

Choosing your model for cloud tasks

Currently, you can’t change the default model for Codex cloud tasks.