# Codex Models

## Recommended models

<div class="not-prose grid gap-6 md:grid-cols-2 xl:grid-cols-3">
  <ModelDetails
    client:load
    name="gpt-5.5"
    slug="gpt-5.5"
    wallpaperUrl="/images/api/models/gpt-5.5.jpg"
    description="OpenAI's newest frontier model for complex coding, computer use, knowledge work, and research workflows in Codex."
    data={{
      features: [
        {
          title: "Capability",
          value: "",
          icons: [
            "openai.SparklesFilled",
            "openai.SparklesFilled",
            "openai.SparklesFilled",
            "openai.SparklesFilled",
            "openai.SparklesFilled",
          ],
        },
        {
          title: "Speed",
          value: "",
          icons: ["openai.Flash", "openai.Flash", "openai.Flash"],
        },
        {
          title: "Codex CLI & SDK",
          value: true,
        },
        { title: "Codex app & IDE extension", value: true },
        {
          title: "Codex Cloud",
          value: false,
        },
        { title: "ChatGPT Credits", value: true },
        { title: "API Access", value: true },
      ],
    }}
  />
4644
4745Codex app & IDE extension<ModelDetails
4846 client:load
4947Codex Cloud name="gpt-5.4"
5048 slug="gpt-5.4"
5149ChatGPT Credits wallpaperUrl="/images/api/models/gpt-5.4.jpg"
5250 description="Flagship frontier model for professional work that brings the industry-leading coding capabilities of GPT-5.3-Codex together with stronger reasoning, tool use, and agentic workflows."
5351API Access data={{
5452 features: [
5553 {
5654 title: "Capability",
5755gpt-5.2-codex value: "",
5856 icons: [
5957Advanced coding model for real-world engineering. Succeeded by GPT-5.3-Codex. "openai.SparklesFilled",
6058 "openai.SparklesFilled",
6159codex -m gpt-5.2-codex "openai.SparklesFilled",
6260 "openai.SparklesFilled",
6361Copy command "openai.SparklesFilled",
6462 ],
6563Capability },
6664 {
6765Speed title: "Speed",
6866 value: "",
6967Codex CLI & SDK icons: ["openai.Flash", "openai.Flash", "openai.Flash"],
7068 },
7169Codex app & IDE extension {
7270 title: "Codex CLI & SDK",
7371Codex Cloud value: true,
7472 },
7573ChatGPT Credits { title: "Codex app & IDE extension", value: true },
7674 {
7775API Access title: "Codex Cloud",
7876 value: false,
7977For most coding tasks in Codex, start with gpt-5.3-codex. It is available for },
8078ChatGPT-authenticated Codex sessions in the Codex app, CLI, IDE extension, and { title: "ChatGPT Credits", value: true },
8179Codex Cloud. API access for GPT-5.3-Codex will come soon. The { title: "API Access", value: true },
8280gpt-5.3-codex-spark model is available in research preview for ChatGPT Pro ],
8381subscribers. }}
82/>

  <ModelDetails
    client:load
    name="gpt-5.4-mini"
    slug="gpt-5.4-mini"
    wallpaperUrl="/images/api/models/gpt-5-mini.jpg"
    description="Fast, efficient mini model for responsive coding tasks and subagents."
    data={{
      features: [
        {
          title: "Capability",
          value: "",
          icons: [
            "openai.SparklesFilled",
            "openai.SparklesFilled",
            "openai.SparklesFilled",
          ],
        },
        {
          title: "Speed",
          value: "",
          icons: ["openai.Flash", "openai.Flash", "openai.Flash", "openai.Flash"],
        },
        {
          title: "Codex CLI & SDK",
          value: true,
        },
        { title: "Codex app & IDE extension", value: true },
        {
          title: "Codex Cloud",
          value: false,
        },
        { title: "ChatGPT Credits", value: true },
        { title: "API Access", value: true },
      ],
    }}
  />

  <ModelDetails
    client:load
    name="gpt-5.3-codex"
    slug="gpt-5.3-codex"
    wallpaperUrl="/images/codex/codex-wallpaper-1.webp"
    description="Industry-leading coding model for complex software engineering. Its coding capabilities now also power GPT-5.4."
    data={{
      features: [
        {
          title: "Capability",
          value: "",
          icons: [
            "openai.SparklesFilled",
            "openai.SparklesFilled",
            "openai.SparklesFilled",
            "openai.SparklesFilled",
            "openai.SparklesFilled",
          ],
        },
        {
          title: "Speed",
          value: "",
          icons: ["openai.Flash", "openai.Flash", "openai.Flash"],
        },
        {
          title: "Codex CLI & SDK",
          value: true,
        },
        { title: "Codex app & IDE extension", value: true },
        {
          title: "Codex Cloud",
          value: true,
        },
        { title: "ChatGPT Credits", value: true },
        { title: "API Access", value: true },
      ],
    }}
  />

  <ModelDetails
    client:load
    name="gpt-5.3-codex-spark"
    slug="gpt-5.3-codex-spark"
    wallpaperUrl="/images/codex/codex-wallpaper-2.webp"
    description="Text-only research preview model optimized for near-instant, real-time coding iteration. Available to ChatGPT Pro users."
    data={{
      features: [
        {
          title: "Capability",
          value: "",
          icons: [
            "openai.SparklesFilled",
            "openai.SparklesFilled",
            "openai.SparklesFilled",
          ],
        },
        {
          title: "Speed",
          value: "",
          icons: [
            "openai.Flash",
            "openai.Flash",
            "openai.Flash",
            "openai.Flash",
            "openai.Flash",
          ],
        },
        {
          title: "Codex CLI & SDK",
          value: true,
        },
        { title: "Codex app & IDE extension", value: true },
        {
          title: "Codex Cloud",
          value: false,
        },
        { title: "ChatGPT Credits", value: false },
        { title: "API Access", value: false },
      ],
    }}
  />

</div>

For most tasks in Codex, start with `gpt-5.5` when it appears in your model
picker. It is strongest for complex coding, computer use, knowledge work, and
research workflows. GPT-5.5 is currently available in Codex when you sign in
with ChatGPT or API-key authentication. Use `gpt-5.4-mini` when you want a
faster, lower-cost option for lighter coding tasks or subagents. The
`gpt-5.3-codex-spark` model is available in research preview for ChatGPT Pro
subscribers and is optimized for near-instant, real-time coding iteration.

## Alternative models

<div class="not-prose grid gap-4 md:grid-cols-2 xl:grid-cols-3">
  <ModelDetails
    client:load
    name="gpt-5.2"
    slug="gpt-5.2"
    description="Previous general-purpose model for coding and agentic tasks, including hard debugging tasks that benefit from deeper deliberation."
    collapsible
    data={{
      features: [
        {
          title: "Capability",
          value: "",
          icons: [
            "openai.SparklesFilled",
            "openai.SparklesFilled",
            "openai.SparklesFilled",
            "openai.SparklesFilled",
          ],
        },
        {
          title: "Speed",
          value: "",
          icons: ["openai.Flash", "openai.Flash", "openai.Flash"],
        },
        {
          title: "Codex CLI & SDK",
          value: true,
        },
        { title: "Codex app & IDE extension", value: true },
        {
          title: "Codex Cloud",
          value: false,
        },
        { title: "ChatGPT Credits", value: true },
        { title: "API Access", value: true },
      ],
    }}
  />
</div>
## Other models

When you sign in with ChatGPT, Codex works best with the models listed above.

You can also point Codex at any model and provider that supports either the [Chat Completions](https://platform.openai.com/docs/api-reference/chat) or [Responses APIs](https://platform.openai.com/docs/api-reference/responses) to fit your specific use case.

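As a sketch of what that can look like in the Codex CLI's `config.toml`, the fragment below points Codex at a locally hosted, OpenAI-compatible endpoint. The provider id `ollama`, the URL, and the model name are illustrative assumptions, not fixed values; check the configuration docs for the exact keys your Codex version supports:

```toml
# Illustrative sketch: select a custom provider and model.
# Top-level keys must come before the table header below,
# or TOML would scope them into that table.
model_provider = "ollama"
model = "llama3"

# A provider exposing a Chat Completions-compatible endpoint.
[model_providers.ollama]
name = "Ollama"
base_url = "http://localhost:11434/v1"
wire_api = "chat"
```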

The Codex CLI and IDE extension use the same `config.toml` [configuration file](https://developers.openai.com/codex/config-basic). To specify a model, add a `model` entry to your configuration file. If you don't specify a model, the Codex app, CLI, or IDE Extension defaults to a recommended model.

```toml
model = "gpt-5.5"
```

If `gpt-5.5` isn't available in your account yet, use `gpt-5.4`.

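If you switch between models often, one option is a named configuration profile, as a sketch (assuming your Codex version supports profiles in `config.toml`; the profile name `mini` is illustrative):

```toml
# Illustrative sketch: a profile that pins a smaller model.
# Select it with `codex --profile mini`.
[profiles.mini]
model = "gpt-5.4-mini"
```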
### Choosing a different local model temporarily

To start a new Codex CLI thread with a specific model, or to specify the model for `codex exec`, use the `--model`/`-m` flag:

```bash
codex -m gpt-5.5
```

### Choosing your model for cloud tasks