- GitHub releases now include a direct PowerShell installer for Windows. (#12741)
- Experimental multi-agent mode can now be enabled from `/agent`, with faster subagent startup, approval prompts, and clearer subagent naming in the TUI. (#12935, #12995, #13218, #13246, #13249, #13404, #13412, #13460)
- Added under-development plugin loading from `config.toml`, including bundled skills, MCP servers, and app connectors. (#12864, #13333, #13401)
- Added under-development presentation and spreadsheet artifact tools for creating and editing `.pptx` and `.xlsx` content inside a thread. (#13341, #13344, #13345, #13347, #13348, #13350, #13355, #13357, #13360, #13362)
- Added an under-development `/fast` toggle in the TUI, and expanded app-server service-tier support to `fast` and `flex`. (#13212, #13334, #13391)
- Enterprise-managed deployments can now pin feature flags in `requirements.toml` instead of only suggesting defaults. (#13388)

## Bug Fixes

- `@` file search now respects the repository boundary, so parent-directory `.gitignore` files no longer hide valid repo files. (#13250)
- Read-only sandbox policies now preserve explicitly granted network access instead of dropping it during permission merging. (#13409)
- Command-line config overrides once again merge correctly with trusted project-local MCP settings. (#13090)
- macOS automation sessions can now launch target apps reliably under bundle-ID sandbox permissions. (#12989)
- The `/feedback` action now opens the correct GitHub bug template instead of a blank form. (#13086)

## Chores

- For app-server integrators, the protocol is now more firmly v2-first: deprecated v1 WebSocket/RPC surfaces were removed, typed `skills/changed` notifications were added, and a flat v2 schema bundle is now published for code generation. (#13324, #13364, #13375, #13397, #13414)

## Changelog

Full Changelog: https://github.com/openai/codex/compare/rust-v0.107.0...rust-v0.108.0

- #13086 Fix CLI feedback link @etraut-openai
- #13063 Make cloud_requirements fail close @alexsong-oai
- #13083 Enable analytics in codex exec and codex mcp-server @etraut-openai
- #12995 feat: approval for sub-agent in the TUI @jif-oai
- #13027 feat: skill disable respect config layer @jif-oai
- #13125 chore: change mem default @jif-oai
- #13088 Tune memory read-path for stale facts @andi-oai
- #13128 nit: ignore `resume_startup_does_not_consume_model_availability_nux_c… @jif-oai
- #12935 Speed up subagent startup @daveaitel-openai
- #13127 nit: disable on windows @jif-oai
- #13129 fix: package `models.json` for Bazel tests @jif-oai
- #13065 core: resolve host_executable() rules during preflight @bolinfest
- #12864 feat: load from plugins @xl-openai
- #12989 fix: MacOSAutomationPermission::BundleIDs should allow communicating … @leoshimo-oai
- #13181 [codex] include plan type in account updates @tibo-openai
- #13058 Record realtime close marker on replacement @aibrahim-oai
- #13215 Fix issue deduplication workflow for Codex issues @etraut-openai
- #13197 Improve subagent contrast in TUI @gabec-openai
- #13130 fix: `/status` when sub-agent @jif-oai
- #13008 feat: polluted memories @jif-oai
- #13237 feat: update memories config names @jif-oai
- #13131 fix: esc in `/agent` @jif-oai
- #13052 core: reuse parent shell snapshot for thread-spawn subagents @daveaitel-openai
- #13249 chore: `/multiagent` alias for `/agent` @jif-oai
- #13057 fix: use https://git.savannah.gnu.org/git/bash instead of https://github.com/bolinfest/bash @bolinfest
- #13090 Fix project trust config parsing so CLI overrides work @etraut-openai
- #13202 tui: restore draft footer hints @charley-oai
- #13246 feat: enable ma through `/agent` @jif-oai
- #12642 fix(core) shell_snapshot multiline exports @dylan-hurd-oai
- #11814 test(app-server): increase flow test timeout to reduce flake @joshka-oai
- #13282 app-server: Update `thread/name/set` to support not-loaded threads @euroelessar
- #13285 feat(app-server): add tracing to all app-server APIs @owenlin0
- #13265 Update realtime websocket API @aibrahim-oai
- #13261 fix(app-server): emit turn/started only when turn actually starts @owenlin0
- #13079 app-server: Silence thread status changes caused by thread being created @euroelessar
- #13284 Adjusting plan prompt for clarity and verbosity @bfioca-openai
- #13286 feat(app-server-test-client): support tracing @owenlin0
- #13061 chore: remove SkillMetadata.permissions and derive skill sandboxing from permission_profile @celia-oai
- #12006 tui: preserve kill buffer across submit and slash-command clears @rakan-oai
- #13212 add fast mode toggle @pash-openai
- #13250 fix(core): scope file search gitignore to repository context @fcoury
- #13313 Renaming Team to Business plan during TUI onboarding @bwanner-oai
- #13248 fix: agent race @jif-oai
- #13235 fix: agent when profile @jif-oai
- #13336 fix: db windows path @jif-oai
- #13334 app-server service tier plumbing (plus some cleanup) @pash-openai
- #13341 feat: presentation artifact p1 @jif-oai
- #13344 feat: pres artifact 2 @jif-oai
- #13346 feat: pres artifact 3 @jif-oai
- #13345 feat: spreadsheet artifact @jif-oai
- #13347 feat: spreadsheet v2 @jif-oai
- #13348 feat: presentation part 4 @jif-oai
- #13350 feat: spreadsheet part 3 @jif-oai
- #13355 feat: pres artifact part 5 @jif-oai
- #13357 feat: add multi-actions to presentation tool @jif-oai
- #13360 feat: artifact presentation part 7 @jif-oai
- #13362 feat: wire spreadsheet artifact @jif-oai
- #12741 Add Windows direct install script @efrazer-oai
- #13376 realtime prompt changes @aibrahim-oai
- #13324 app-server-protocol: export flat v2 schema bundle @apanasenko-oai
- #13364 Remove Responses V1 websocket implementation @pakrym-oai
- #12969 app-server: source /feedback logs from sqlite at trace level @charley-oai
- #13381 chore: rm --all-features flag from rust-analyzer @sayan-oai
- #13043 Collapse parsed command summaries when any stage is unknown @nornagon-openai
- #13385 Revert "realtime prompt changes" @aibrahim-oai
- #13389 fix @aibrahim-oai
- #13375 chore(app-server): delete v1 RPC methods and notifications @owenlin0
- #13397 chore(app-server): restore EventMsg TS types @owenlin0
- #13395 Build delegated realtime handoff text from all messages @aibrahim-oai
- #13399 Require deduplicator success before commenting @etraut-openai
- #13398 Revert "Revert "realtime prompt changes"" @aibrahim-oai
- #13333 Refactor plugin config and cache path @xl-openai
- #13275 fix(network-proxy): reject mismatched host headers @viyatb-oai
- #12868 tui: align pending steers with core acceptance @charley-oai
- #13280 Add thread metadata update endpoint to app server @joeytrasatti-openai
- #13050 Add under-development original-resolution view_image support @fjord-oai
- #13402 Ensure the env values of imported shell_environment_policy.set is string @alexsong-oai
- #13331 Make js_repl image output controllable @fjord-oai
- #13401 feat: load plugin apps @sayan-oai
- #13292 [feedback] diagnostics @rhan-oai
- #13414 feat(app-server): add a skills/changed v2 notification @owenlin0
- #13368 feat(app-server): propagate app-server trace context into core @owenlin0
- #13413 copy command-runner to CODEX_HOME so sandbox users can always execute it @iceweasel-oai
- #13366 [bazel] Bump rules_rs and llvm @zbarsky-openai
- #13409 Feat: Preserve network access on read-only sandbox policies @celia-oai
- #13388 config: enforce enterprise feature requirements @bolinfest
- #13218 Add role-specific subagent nickname overrides @gabec-openai
- #13427 chore: Nest skill and protocol network permissions under `network.enabled` @celia-oai
- #13429 core: box wrapper futures to reduce stack pressure @bolinfest
- #13391 support 'flex' tier in app-server in addition to 'fast' @kharvd
- #13290 image-gen-core @won-openai
- #13404 feat: better multi-agent prompt @jif-oai
- #13412 feat: ordinal nick name @jif-oai
- #13454 add metric for per-turn token usage @jif-oai
- #13240 fix: pending messages in `/agent` @jif-oai
- #13461 fix: bad merge @jif-oai
- #13460 feat: disable request input on sub agent @jif-oai
- #13456 feat: add metric for per-turn tool count and add tmp_mem flag @jif-oai
- #13468 nit: citation prompt @jif-oai
- #13467 feat: memories in workspace write @jif-oai
# Codex CLI (Rust Implementation)
We provide Codex CLI as a standalone, native executable to ensure a zero-dependency install.
## Installing Codex

Today, the easiest way to install Codex is via npm:

```shell
npm i -g @openai/codex
codex
```

You can also install via Homebrew (`brew install --cask codex`) or download a platform-specific release directly from our GitHub Releases.
## Documentation quickstart

- First run with Codex? Start with `docs/getting-started.md`, which links to the walkthrough for prompts, keyboard shortcuts, and session management.
- Want deeper control? See `docs/config.md` and `docs/install.md`.
## What's new in the Rust CLI
The Rust implementation is now the maintained Codex CLI and serves as the default experience. It includes a number of features that the legacy TypeScript CLI never supported.
## Config

Codex supports a rich set of configuration options. Note that the Rust CLI uses `config.toml` instead of `config.json`. See `docs/config.md` for details.
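To make the shape concrete, here is a minimal sketch of a `config.toml`. The values are illustrative and the `model` key name is an assumption patterned on common usage; only `sandbox_mode` is confirmed elsewhere in this README, so check `docs/config.md` for the authoritative list of keys.

```toml
# ~/.codex/config.toml — illustrative values only

# Model used for new sessions (assumed key name; see docs/config.md)
model = "gpt-5"

# Default sandbox policy (documented later in this README)
sandbox_mode = "workspace-write"
```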
## Model Context Protocol Support
### MCP client
Codex CLI functions as an MCP client, allowing the CLI and the IDE extension to connect to MCP servers on startup. See the configuration documentation for details.
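As a sketch of what an MCP server entry might look like in `config.toml` — the `mcp_servers` table name and the `command`/`args` fields here are assumptions patterned on typical TOML layouts, and the server shown is just an example, so confirm the exact schema in the configuration documentation:

```toml
# Launch this MCP server when Codex starts.
# Assumed table/field names; verify against docs/config.md.
[mcp_servers.everything]
command = "npx"
args = ["-y", "@modelcontextprotocol/server-everything"]
```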
### MCP server (experimental)
Codex can be launched as an MCP server by running `codex mcp-server`. This allows other MCP clients to use Codex as a tool within another agent.
Use the `@modelcontextprotocol/inspector` to try it out:

```shell
npx @modelcontextprotocol/inspector codex mcp-server
```
Use `codex mcp` to add/list/get/remove MCP server launchers defined in `config.toml`, and `codex mcp-server` to run the MCP server directly.
## Notifications
You can enable notifications by configuring a script that runs whenever the agent finishes a turn. The notify documentation includes a detailed example of getting desktop notifications via `terminal-notifier` on macOS. When Codex detects that it is running under WSL 2 inside Windows Terminal (`WT_SESSION` is set), the TUI automatically falls back to native Windows toast notifications, so approval prompts and completed turns still surface even though Windows Terminal does not implement OSC 9.
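As a sketch, the hook might be wired up in `config.toml` like this — the `notify` key and the argv-style value are assumptions here, and the script path is purely hypothetical; the notify documentation has the authoritative shape of the key and of the payload the script receives:

```toml
# Run this program whenever the agent finishes a turn; Codex passes
# turn details as an extra argument, so a small wrapper script is typical.
# Assumed key name and value shape; see the notify documentation.
notify = ["python3", "/path/to/notify.py"]
```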
## `codex exec` to run Codex programmatically/non-interactively
To run Codex non-interactively, run `codex exec PROMPT` (you can also pass the prompt via stdin); Codex will work on your task until it decides it is done and exits. Output is printed directly to the terminal. Set the `RUST_LOG` environment variable to see more about what's going on.
Use `codex exec --ephemeral ...` to run without persisting session rollout files to disk.
## Experimenting with the Codex Sandbox
To see what happens when a command is run under the sandbox provided by Codex, use the following subcommands in Codex CLI:
```shell
# macOS
codex sandbox macos [--full-auto] [--log-denials] [COMMAND]...

# Linux
codex sandbox linux [--full-auto] [COMMAND]...

# Windows
codex sandbox windows [--full-auto] [COMMAND]...

# Legacy aliases
codex debug seatbelt [--full-auto] [--log-denials] [COMMAND]...
codex debug landlock [--full-auto] [COMMAND]...
```
## Selecting a sandbox policy via `--sandbox`
The Rust CLI exposes a dedicated `--sandbox` (`-s`) flag that lets you pick the sandbox policy without having to reach for the generic `-c`/`--config` option:
```shell
# Run Codex with the default, read-only sandbox
codex --sandbox read-only

# Allow the agent to write within the current workspace while still blocking network access
codex --sandbox workspace-write

# Danger! Disable sandboxing entirely (only do this if you are already
# running in a container or other isolated environment)
codex --sandbox danger-full-access
```
The same setting can be persisted in `~/.codex/config.toml` via the top-level `sandbox_mode = "MODE"` key, e.g. `sandbox_mode = "workspace-write"`.
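As a config sketch: `sandbox_mode` comes straight from this section, while the nested `[sandbox_workspace_write]` table and its `network_access` key are assumptions included to illustrate per-mode tuning, so verify the real key names against `docs/config.md`:

```toml
# ~/.codex/config.toml

# Default sandbox policy for all sessions
sandbox_mode = "workspace-write"

# Assumed table/key for per-mode tuning; verify against docs/config.md
[sandbox_workspace_write]
network_access = false
```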
In `workspace-write`, Codex also includes `~/.codex/memories` in its writable roots so memory maintenance does not require an extra approval.
## Code Organization
This folder is the root of a Cargo workspace. It contains quite a bit of experimental code, but here are the key crates:
- `core/` contains the business logic for Codex. Ultimately, we hope this will be a library crate that is generally useful for building other Rust/native applications that use Codex.
- `exec/` is the "headless" CLI for use in automation.
- `tui/` is the CLI that launches a fullscreen TUI built with Ratatui.
- `cli/` is the CLI multitool that provides the aforementioned CLIs via subcommands.
If you want to contribute or inspect behavior in detail, start by reading the module-level `README.md` files under each crate, and run the workspace from the top-level `codex-rs` directory so shared config, features, and build scripts stay aligned.