## Recommended models

**gpt-5.5**

OpenAI's newest frontier model for complex coding, computer use, knowledge work, and research workflows in Codex.

```bash
codex -m gpt-5.5
```
**gpt-5.4**

```bash
codex -m gpt-5.4
```
For most tasks in Codex, start with `gpt-5.5` when it appears in your model picker. It is strongest for complex coding, computer use, knowledge work, and research workflows. GPT-5.5 is currently available in Codex when you sign in with ChatGPT; it isn't available with API-key authentication. During the rollout, continue using `gpt-5.4` if `gpt-5.5` is not yet available. Use `gpt-5.4-mini` when you want a faster, lower-cost option for lighter coding tasks or subagents. The `gpt-5.3-codex-spark` model is available in research preview for ChatGPT Pro subscribers and is optimized for near-instant, real-time coding iteration.
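If you switch among these models regularly, you can encode the choice in `config.toml` instead of passing `-m` each time. A minimal sketch, assuming your Codex CLI version supports named profiles (the profile name `light` is illustrative, not from this page):

```toml
# Default model for most Codex sessions
model = "gpt-5.5"

# Lighter, lower-cost profile for quick tasks or subagents;
# select it with `codex --profile light`
[profiles.light]
model = "gpt-5.4-mini"
```

With a layout like this, plain `codex` starts with `gpt-5.5`, and `codex --profile light` switches to `gpt-5.4-mini` without editing the file again.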
## Alternative models
**gpt-5.2**

Previous general-purpose model for coding and agentic tasks, including hard debugging tasks that benefit from deeper deliberation.

```bash
codex -m gpt-5.2
```
## Other models

When you sign in with ChatGPT, Codex works best with the models listed above.

You can also point Codex at any model and provider that supports either the [Chat Completions](https://platform.openai.com/docs/api-reference/chat) or [Responses APIs](https://platform.openai.com/docs/api-reference/responses) to fit your specific use case.

The Codex CLI and IDE extension use the same `config.toml` [configuration file](https://developers.openai.com/codex/config-basic). To specify a model, add a `model` entry to your configuration file. If you don't specify a model, the Codex app, CLI, or IDE Extension defaults to a recommended model.

```toml
model = "gpt-5.5"
```

If `gpt-5.5` isn't available in your account yet, use `gpt-5.4`.
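Pointing Codex at a non-default provider also goes through `config.toml`. The following is a hedged sketch, not an authoritative recipe: the provider id `local`, the model name, and the URL are illustrative, and the exact keys (`base_url`, `wire_api`) should be checked against your Codex CLI version's configuration reference:

```toml
# Route Codex to a hypothetical OpenAI-compatible endpoint
# that speaks the Chat Completions wire protocol
model = "my-local-model"
model_provider = "local"

[model_providers.local]
name = "Local OpenAI-compatible server"
base_url = "http://localhost:11434/v1"
wire_api = "chat"
```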
### Choosing a different local model temporarily

In the Codex CLI, you can use the `/model` command during an active thread to change the model. In the IDE extension, you can use the model selector below the input box to choose your model.

To start a new Codex CLI thread with a specific model, or to specify the model for `codex exec`, use the `--model`/`-m` flag:

```bash
codex -m gpt-5.5
```

### Choosing your model for cloud tasks