- GitHub releases now include a direct PowerShell installer for Windows. (#12741)
- Experimental multi-agent mode can now be enabled from `/agent`, with faster subagent startup, approval prompts, and clearer subagent naming in the TUI. (#12935, #12995, #13218, #13246, #13249, #13404, #13412, #13460)
- Added under-development plugin loading from `config.toml`, including bundled skills, MCP servers, and app connectors. (#12864, #13333, #13401)
- Added under-development presentation and spreadsheet artifact tools for creating and editing `.pptx` and `.xlsx` content inside a thread. (#13341, #13344, #13345, #13347, #13348, #13350, #13355, #13357, #13360, #13362)
- Added an under-development `/fast` toggle in the TUI, and expanded app-server service-tier support to `fast` and `flex`. (#13212, #13334, #13391)
- Enterprise-managed deployments can now pin feature flags in `requirements.toml` instead of only suggesting defaults. (#13388)

## Bug Fixes

- `@` file search now respects the repository boundary, so parent-directory `.gitignore` files no longer hide valid repo files. (#13250)
- Read-only sandbox policies now preserve explicitly granted network access instead of dropping it during permission merging. (#13409)
- Command-line config overrides once again merge correctly with trusted project-local MCP settings. (#13090)
- macOS automation sessions can now launch target apps reliably under bundle-ID sandbox permissions. (#12989)
- The `/feedback` action now opens the correct GitHub bug template instead of a blank form. (#13086)

## Chores

- For app-server integrators, the protocol is now more firmly v2-first: deprecated v1 WebSocket/RPC surfaces were removed, typed `skills/changed` notifications were added, and a flat v2 schema bundle is now published for code generation. (#13324, #13364, #13375, #13397, #13414)

## Changelog

Full Changelog: https://github.com/openai/codex/compare/rust-v0.107.0...rust-v0.108.0

- #13086 Fix CLI feedback link @etraut-openai
- #13063 Make cloud_requirements fail close @alexsong-oai
- #13083 Enable analytics in codex exec and codex mcp-server @etraut-openai
- #12995 feat: approval for sub-agent in the TUI @jif-oai
- #13027 feat: skill disable respect config layer @jif-oai
- #13125 chore: change mem default @jif-oai
- #13088 Tune memory read-path for stale facts @andi-oai
- #13128 nit: ignore `resume_startup_does_not_consume_model_availability_nux_c… @jif-oai
- #12935 Speed up subagent startup @daveaitel-openai
- #13127 nit: disable on windows @jif-oai
- #13129 fix: package `models.json` for Bazel tests @jif-oai
- #13065 core: resolve host_executable() rules during preflight @bolinfest
- #12864 feat: load from plugins @xl-openai
- #12989 fix: MacOSAutomationPermission::BundleIDs should allow communicating … @leoshimo-oai
- #13181 [codex] include plan type in account updates @tibo-openai
- #13058 Record realtime close marker on replacement @aibrahim-oai
- #13215 Fix issue deduplication workflow for Codex issues @etraut-openai
- #13197 Improve subagent contrast in TUI @gabec-openai
- #13130 fix: `/status` when sub-agent @jif-oai
- #13008 feat: polluted memories @jif-oai
- #13237 feat: update memories config names @jif-oai
- #13131 fix: esc in `/agent` @jif-oai
- #13052 core: reuse parent shell snapshot for thread-spawn subagents @daveaitel-openai
- #13249 chore: `/multiagent` alias for `/agent` @jif-oai
- #13057 fix: use https://git.savannah.gnu.org/git/bash instead of https://github.com/bolinfest/bash @bolinfest
- #13090 Fix project trust config parsing so CLI overrides work @etraut-openai
- #13202 tui: restore draft footer hints @charley-oai
- #13246 feat: enable ma through `/agent` @jif-oai
- #12642 fix(core) shell_snapshot multiline exports @dylan-hurd-oai
- #11814 test(app-server): increase flow test timeout to reduce flake @joshka-oai
- #13282 app-server: Update `thread/name/set` to support not-loaded threads @euroelessar
- #13285 feat(app-server): add tracing to all app-server APIs @owenlin0
- #13265 Update realtime websocket API @aibrahim-oai
- #13261 fix(app-server): emit turn/started only when turn actually starts @owenlin0
- #13079 app-server: Silence thread status changes caused by thread being created @euroelessar
- #13284 Adjusting plan prompt for clarity and verbosity @bfioca-openai
- #13286 feat(app-server-test-client): support tracing @owenlin0
- #13061 chore: remove SkillMetadata.permissions and derive skill sandboxing from permission_profile @celia-oai
- #12006 tui: preserve kill buffer across submit and slash-command clears @rakan-oai
- #13212 add fast mode toggle @pash-openai
- #13250 fix(core): scope file search gitignore to repository context @fcoury
- #13313 Renaming Team to Business plan during TUI onboarding @bwanner-oai
- #13248 fix: agent race @jif-oai
- #13235 fix: agent when profile @jif-oai
- #13336 fix: db windows path @jif-oai
- #13334 app-server service tier plumbing (plus some cleanup) @pash-openai
- #13341 feat: presentation artifact p1 @jif-oai
- #13344 feat: pres artifact 2 @jif-oai
- #13346 feat: pres artifact 3 @jif-oai
- #13345 feat: spreadsheet artifact @jif-oai
- #13347 feat: spreadsheet v2 @jif-oai
- #13348 feat: presentation part 4 @jif-oai
- #13350 feat: spreadsheet part 3 @jif-oai
- #13355 feat: pres artifact part 5 @jif-oai
- #13357 feat: add multi-actions to presentation tool @jif-oai
- #13360 feat: artifact presentation part 7 @jif-oai
- #13362 feat: wire spreadsheet artifact @jif-oai
- #12741 Add Windows direct install script @efrazer-oai
- #13376 realtime prompt changes @aibrahim-oai
- #13324 app-server-protocol: export flat v2 schema bundle @apanasenko-oai
- #13364 Remove Responses V1 websocket implementation @pakrym-oai
- #12969 app-server: source /feedback logs from sqlite at trace level @charley-oai
- #13381 chore: rm --all-features flag from rust-analyzer @sayan-oai
- #13043 Collapse parsed command summaries when any stage is unknown @nornagon-openai
- #13385 Revert "realtime prompt changes" @aibrahim-oai
- #13389 fix @aibrahim-oai
- #13375 chore(app-server): delete v1 RPC methods and notifications @owenlin0
- #13397 chore(app-server): restore EventMsg TS types @owenlin0
- #13395 Build delegated realtime handoff text from all messages @aibrahim-oai
- #13399 Require deduplicator success before commenting @etraut-openai
- #13398 Revert "Revert "realtime prompt changes"" @aibrahim-oai
- #13333 Refactor plugin config and cache path @xl-openai
- #13275 fix(network-proxy): reject mismatched host headers @viyatb-oai
- #12868 tui: align pending steers with core acceptance @charley-oai
- #13280 Add thread metadata update endpoint to app server @joeytrasatti-openai
- #13050 Add under-development original-resolution view_image support @fjord-oai
- #13402 Ensure the env values of imported shell_environment_policy.set is string @alexsong-oai
- #13331 Make js_repl image output controllable @fjord-oai
- #13401 feat: load plugin apps @sayan-oai
- #13292 [feedback] diagnostics @rhan-oai
- #13414 feat(app-server): add a skills/changed v2 notification @owenlin0
- #13368 feat(app-server): propagate app-server trace context into core @owenlin0
- #13413 copy command-runner to CODEX_HOME so sandbox users can always execute it @iceweasel-oai
- #13366 [bazel] Bump rules_rs and llvm @zbarsky-openai
- #13409 Feat: Preserve network access on read-only sandbox policies @celia-oai
- #13388 config: enforce enterprise feature requirements @bolinfest
- #13218 Add role-specific subagent nickname overrides @gabec-openai
- #13427 chore: Nest skill and protocol network permissions under `network.enabled` @celia-oai
- #13429 core: box wrapper futures to reduce stack pressure @bolinfest
- #13391 support 'flex' tier in app-server in addition to 'fast' @kharvd
- #13290 image-gen-core @won-openai
- #13404 feat: better multi-agent prompt @jif-oai
- #13412 feat: ordinal nick name @jif-oai
- #13454 add metric for per-turn token usage @jif-oai
- #13240 fix: pending messages in `/agent` @jif-oai
- #13461 fix: bad merge @jif-oai
- #13460 feat: disable request input on sub agent @jif-oai
- #13456 feat: add metric for per-turn tool count and add tmp_mem flag @jif-oai
- #13468 nit: citation prompt @jif-oai
- #13467 feat: memories in workspace write @jif-oai
```shell
npm i -g @openai/codex
# or
brew install --cask codex
```
Codex CLI is a coding agent from OpenAI that runs locally on your computer.
If you want Codex in your code editor (VS Code, Cursor, Windsurf), install in your IDE.
If you want the desktop app experience, run `codex app` or visit the Codex App page.
If you are looking for the cloud-based agent from OpenAI, Codex Web, go to chatgpt.com/codex.
Quickstart
Installing and running Codex CLI
Install globally with your preferred package manager:
```shell
# Install using npm
npm install -g @openai/codex

# Install using Homebrew
brew install --cask codex
```
Then simply run `codex` to get started.
You can also go to the latest GitHub Release and download the appropriate binary for your platform.
Each GitHub Release contains many executables, but in practice, you likely want one of these:
- macOS
  - Apple Silicon/arm64: `codex-aarch64-apple-darwin.tar.gz`
  - x86_64 (older Mac hardware): `codex-x86_64-apple-darwin.tar.gz`
- Linux
  - x86_64: `codex-x86_64-unknown-linux-musl.tar.gz`
  - arm64: `codex-aarch64-unknown-linux-musl.tar.gz`
Each archive contains a single entry with the platform baked into the name (e.g., `codex-x86_64-unknown-linux-musl`), so you likely want to rename it to `codex` after extracting it.
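If you script the download, the asset name can be derived from `uname` output. A minimal sketch, assuming the asset names listed above and GitHub's standard latest-release URL pattern (the `codex_asset` helper is illustrative, not part of any official installer):

```shell
# Map OS/arch to the release asset name (naming taken from the list above).
codex_asset() {
  local os="$1" arch="$2" target
  case "$arch" in
    arm64|aarch64) target="aarch64" ;;
    *)             target="x86_64" ;;
  esac
  case "$os" in
    Darwin) echo "codex-${target}-apple-darwin.tar.gz" ;;
    *)      echo "codex-${target}-unknown-linux-musl.tar.gz" ;;
  esac
}

asset="$(codex_asset "$(uname -s)" "$(uname -m)")"
echo "$asset"
# Then download, extract, and rename:
# curl -fsSLO "https://github.com/openai/codex/releases/latest/download/$asset"
# tar -xzf "$asset"
# mv codex-* codex && chmod +x codex
```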
Using Codex with your ChatGPT plan
Run `codex` and select Sign in with ChatGPT. We recommend signing into your ChatGPT account to use Codex as part of your Plus, Pro, Team, Edu, or Enterprise plan. Learn more about what's included in your ChatGPT plan.
You can also use Codex with an API key, but this requires additional setup.
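A rough sketch of that setup, assuming Codex picks up the standard `OPENAI_API_KEY` environment variable (see the docs for the exact mechanism and any additional steps):

```shell
# Export an API key before launching Codex; the value below is a placeholder.
export OPENAI_API_KEY="sk-your-key-here"
# codex   # then run codex as usual
```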
Docs
This repository is licensed under the Apache-2.0 License.
