## TypeScript library

The TypeScript library provides a way to control Codex from within your application that's more comprehensive and flexible than non-interactive mode.

Use the library server-side; it requires Node.js 18 or later.

Start a thread with Codex and run it with your prompt.

```ts
import { Codex } from "@openai/codex-sdk";

const codex = new Codex();
const thread = codex.startThread();
const result = await thread.run("Make a plan to diagnose and fix the CI failures");
console.log(result.finalResponse);
```

For more details, check out the [TypeScript repo](https://github.com/openai/codex/tree/main/sdk/typescript).
## Python library

The Python SDK is experimental and controls the local Codex app-server over JSON-RPC. It requires Python 3.10 or later and a local checkout of the open-source Codex repo.

### Installation

From the Codex repo root, install the SDK in editable mode:

```bash
cd sdk/python
python -m pip install -e .
```

For manual local usage, pass `AppServerConfig(codex_bin=...)` to point the SDK at a local `codex` binary, or use the repo examples and notebook bootstrap.

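As a configuration sketch (not runnable without a local Codex build), this is roughly what pointing the SDK at a local binary might look like; the binary path is hypothetical, and passing the config object to `Codex(...)` is an assumption about the constructor:

```python
from codex_app_server import AppServerConfig, Codex

# Hypothetical path to a locally built binary from the Codex repo.
config = AppServerConfig(codex_bin="/path/to/codex/target/debug/codex")

# Assumption: the Codex client accepts the config at construction time.
with Codex(config) as codex:
    thread = codex.thread_start(model="gpt-5.4")
    print(thread.run("Summarize the repo layout").final_response)
```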
### Usage

Start Codex, create a thread, and run a prompt:

```python
from codex_app_server import Codex

with Codex() as codex:
    thread = codex.thread_start(model="gpt-5.4")
    result = thread.run("Make a plan to diagnose and fix the CI failures")
    print(result.final_response)
```

Use `AsyncCodex` when your application is already asynchronous:

```python
import asyncio

from codex_app_server import AsyncCodex


async def main() -> None:
    async with AsyncCodex() as codex:
        thread = await codex.thread_start(model="gpt-5.4")
        result = await thread.run("Implement the plan")
        print(result.final_response)


asyncio.run(main())
```

For more details, check out the [Python repo](https://github.com/openai/codex/tree/main/sdk/python).