Compare commits

...

5 Commits

Author SHA1 Message Date
Felipe Coury
7be5328f34 feat: add codex web terminal 2026-05-02 17:16:20 -03:00
pakrym-oai
35aaa5d9fc Bound websocket request sends with idle timeout (#20751)
## Why

We saw Responses websocket sessions recover only after a long quiet
period when the server had already logged the websocket as disconnected.
The normal connect path is already bounded by
`websocket_connect_timeout_ms`, but the first request send on an
established websocket reused only the receive-side idle timeout after
the write completed. If the socket write/pump stalls, the client can sit
in `ws_stream.send(...)` without reaching the existing receive timeout.
2026-05-01 23:33:32 -07:00
Matthew Zeng
f88701f5c8 [tool_suggest] More prompt polishes. (#20566)
Tool suggest still misfires when the model needs tool_search, so this
updates the prompts to further disambiguate it:

- [x] rename it from `tool_suggest` to `request_plugin_install`
- [x] rephrase "suggestion" to "install" in the tool descriptions.
- [x] disambiguate "the tool" vs "the plugin/connector". 

Tested with the Codex App and verified it still works.
2026-05-02 04:22:12 +00:00
Felipe Coury
127434cd8b fix(tui): bound startup terminal probes (#20654)
## Summary

Bound TUI startup terminal response probes so unsupported terminals
cannot stall startup for multiple seconds.

This replaces the blocking crossterm response probes used during Unix
startup with short `/dev/tty` probes that use nonblocking reads and
`poll` with a 100ms timeout. It covers the initial cursor-position
query, keyboard enhancement support detection, and OSC 10/11
default-color detection. The default-color probe uses one shared
deadline for foreground and background instead of allowing two
independent full waits.

The diagnostic mode/trace env vars from the investigation branch are
intentionally not included. The shipped behavior is simply bounded
probing by default, while non-Unix keeps the existing crossterm fallback
path.

## Details

- Add a private `terminal_probe` module for bounded Unix terminal probes
and response parsers.
- Let `custom_terminal::Terminal` accept a caller-provided initial
cursor position so startup can compute it before constructing the
terminal.
- Use bounded cursor, keyboard enhancement, and default-color probes on
Unix startup.
- Preserve default-color cache behavior so a failed query attempt is not
retried forever.

## Validation

- `cd codex-rs && just fmt`
- `cd codex-rs && cargo test -p codex-tui terminal_probe`
- `cd codex-rs && just fix -p codex-tui`
- `cd codex-rs && just argument-comment-lint`
- `git diff --check`
- `git diff --cached --check`

`cd codex-rs && cargo test -p codex-tui` still aborts on the
pre-existing local stack overflow in
`app::tests::discard_side_thread_keeps_local_state_when_server_close_fails`;
I reproduced that same focused failure on `main` before this PR work, so
it is not introduced by this change.

Manual validation in the VM showed the original crossterm path taking
about 2s per unanswered probe, while bounded probing returned in about
100ms per probe.
2026-05-02 01:20:57 +00:00
jgershen-oai
9e905528bb Fix custom CA login behind TLS-inspecting proxies (#20676)
Refs:
https://linear.app/openai/issue/SE-6311/login-fails-for-experian-users-behind-tls-inspecting-proxy

## Summary
- When a custom CA bundle is configured, force the shared `codex-client`
reqwest builder onto rustls before registering custom roots.
- Add the `rustls-tls-native-roots` reqwest feature so the rustls client
preserves native roots plus the enterprise CA bundle.
- Add subprocess TLS coverage for both a direct local TLS 1.3 server and
a hermetic local CONNECT TLS-intercepting proxy that forwards a
token-exchange-shaped POST to a local origin.

## Plain-language explanation
Experian users are behind a TLS-inspecting proxy, so the login token
exchange needs to trust the enterprise CA bundle from
`CODEX_CA_CERTIFICATE` or `SSL_CERT_FILE`. Before this change, that
custom-CA branch still used reqwest default TLS selection, which could
fail in the proxy environment. Now, only when a custom CA is configured,
Codex selects rustls first and then adds the custom CA roots, matching
the validated behavior from the Experian test build while leaving normal
system-root clients unchanged.

The new regression test recreates the enterprise-proxy shape locally:
the probe client sends an HTTPS `POST /oauth/token` through an explicit
HTTP CONNECT proxy, the proxy presents a leaf certificate signed by a
runtime-generated test CA, decrypts the request, forwards it to a local
origin, and relays the `ok` response back.

## Scope note
- The actual production fix is the first commit: `8368119282 Fix custom
CA reqwest clients to use rustls`.
- The second commit is integration-test coverage only. It generates all
test CA and localhost certificate material at runtime.

## Validation
- `cd codex-rs && cargo test -p codex-client --test ca_env
posts_to_token_origin_through_tls_intercepting_proxy_with_custom_ca_bundle
-- --nocapture`
- `cd codex-rs && cargo test -p codex-client`
- `cd codex-rs && cargo test -p codex-login`
- `cd codex-rs && just fmt`
- `cd codex-rs && just bazel-lock-update`
- `cd codex-rs && just bazel-lock-check`
- `cd codex-rs && just fix -p codex-client`
2026-05-01 17:51:49 -07:00
54 changed files with 3589 additions and 319 deletions

codex-rs/Cargo.lock generated
View File

@@ -2188,6 +2188,7 @@ dependencies = [
"codex-utils-cargo-bin",
"codex-utils-cli",
"codex-utils-path",
"codex-web-server",
"codex-windows-sandbox",
"libc",
"owo-colors",
@@ -2220,6 +2221,7 @@ dependencies = [
"opentelemetry_sdk",
"pretty_assertions",
"rand 0.9.3",
"rcgen",
"reqwest",
"rustls",
"rustls-native-certs",
@@ -3935,6 +3937,24 @@ dependencies = [
"v8",
]
[[package]]
name = "codex-web-server"
version = "0.0.0"
dependencies = [
"anyhow",
"axum",
"clap",
"codex-utils-pty",
"futures",
"include_dir",
"pretty_assertions",
"tokio",
"tokio-tungstenite",
"tracing",
"url",
"webbrowser",
]
[[package]]
name = "codex-windows-sandbox"
version = "0.0.0"

View File

@@ -104,6 +104,7 @@ members = [
"thread-store",
"uds",
"codex-experimental-api-macros",
"web-server",
"plugin",
"model-provider",
]
@@ -223,6 +224,7 @@ codex-utils-stream-parser = { path = "utils/stream-parser" }
codex-utils-string = { path = "utils/string" }
codex-utils-template = { path = "utils/template" }
codex-v8-poc = { path = "v8-poc" }
codex-web-server = { path = "web-server" }
codex-windows-sandbox = { path = "windows-sandbox-rs" }
core_test_support = { path = "core/tests/common" }
mcp_test_support = { path = "mcp-server/tests/common" }
@@ -322,6 +324,10 @@ quick-xml = "0.38.4"
rand = "0.9"
ratatui = "0.29.0"
ratatui-macros = "0.6.0"
rcgen = { version = "0.14.7", default-features = false, features = [
"aws_lc_rs",
"pem",
] }
regex = "1.12.3"
regex-lite = "0.1.8"
reqwest = { version = "0.12", features = ["cookies"] }

View File

@@ -50,6 +50,7 @@ codex-terminal-detection = { workspace = true }
codex-tui = { workspace = true }
codex-utils-absolute-path = { workspace = true }
codex-utils-path = { workspace = true }
codex-web-server = { workspace = true }
libc = { workspace = true }
owo-colors = { workspace = true }
regex-lite = { workspace = true }

View File

@@ -126,6 +126,9 @@ enum Subcommand {
/// [experimental] Run the app server or related tooling.
AppServer(AppServerCommand),
/// Serve Codex in a browser-backed terminal.
Web(codex_web_server::WebCommand),
/// Launch the Codex desktop app (opens the app installer if missing).
#[cfg(any(target_os = "macos", target_os = "windows"))]
App(app_cmd::AppCommand),
@@ -888,6 +891,14 @@ async fn cli_main(arg0_paths: Arg0DispatchPaths) -> anyhow::Result<()> {
}
}
}
Some(Subcommand::Web(web_cli)) => {
reject_remote_mode_for_subcommand(
root_remote.as_deref(),
root_remote_auth_token_env.as_deref(),
"web",
)?;
codex_web_server::run(web_cli, root_config_overrides.raw_overrides.clone()).await?;
}
#[cfg(any(target_os = "macos", target_os = "windows"))]
Some(Subcommand::App(app_cli)) => {
reject_remote_mode_for_subcommand(
@@ -2449,6 +2460,30 @@ mod tests {
);
}
#[test]
fn web_command_parses_forwarded_args() {
let cli = MultitoolCli::try_parse_from([
"codex",
"web",
"--listen",
"127.0.0.1:4321",
"--",
"--model",
"gpt-test",
])
.expect("parse");
let Some(Subcommand::Web(web)) = cli.subcommand else {
panic!("expected web subcommand");
};
assert_eq!(
web.listen,
"127.0.0.1:4321"
.parse::<std::net::SocketAddr>()
.expect("socket address should parse")
);
assert_eq!(web.codex_args, vec!["--model", "gpt-test"]);
}
#[test]
fn reject_remote_auth_token_env_for_app_server_proxy() {
let subcommand = AppServerSubcommand::Proxy(AppServerProxyCommand { socket_path: None });

View File

@@ -556,10 +556,15 @@ async fn run_websocket_response_stream(
trace!("websocket request: {request_text}");
let request_start = Instant::now();
let result = ws_stream
.send(Message::Text(request_text.into()))
.await
.map_err(|err| ApiError::Stream(format!("failed to send websocket request: {err}")));
let result = tokio::time::timeout(
idle_timeout,
ws_stream.send(Message::Text(request_text.into())),
)
.await
.map_err(|_| ApiError::Stream("idle timeout sending websocket request".into()))
.and_then(|result| {
result.map_err(|err| ApiError::Stream(format!("failed to send websocket request: {err}")))
});
if let Some(t) = telemetry.as_ref() {
t.on_ws_request(

View File

@@ -12,7 +12,7 @@ futures = { workspace = true }
http = { workspace = true }
opentelemetry = { workspace = true }
rand = { workspace = true }
reqwest = { workspace = true, features = ["json", "stream"] }
reqwest = { workspace = true, features = ["json", "rustls-tls-native-roots", "stream"] }
rustls = { workspace = true }
rustls-native-certs = { workspace = true }
rustls-pki-types = { workspace = true }
@@ -32,5 +32,6 @@ workspace = true
codex-utils-cargo-bin = { workspace = true }
opentelemetry_sdk = { workspace = true }
pretty_assertions = { workspace = true }
rcgen = { workspace = true }
tempfile = { workspace = true }
tracing-subscriber = { workspace = true }

View File

@@ -8,22 +8,93 @@
//! - env precedence is respected,
//! - multi-cert PEM bundles load,
//! - error messages guide users when CA files are invalid.
//! - optional HTTPS probes can complete a request through the constructed client.
//!
//! The detailed explanation of what "hermetic" means here lives in `codex_client::custom_ca`.
//! This binary exists so the tests can exercise
//! [`codex_client::build_reqwest_client_for_subprocess_tests`] in a separate process without
//! duplicating client-construction logic.
use std::env;
use std::process;
use std::time::Duration;
const PROBE_TLS13_ENV: &str = "CODEX_CUSTOM_CA_PROBE_TLS13";
const PROBE_PROXY_ENV: &str = "CODEX_CUSTOM_CA_PROBE_PROXY";
const PROBE_URL_ENV: &str = "CODEX_CUSTOM_CA_PROBE_URL";
fn main() {
match codex_client::build_reqwest_client_for_subprocess_tests(reqwest::Client::builder()) {
Ok(_) => {
println!("ok");
let runtime = match tokio::runtime::Builder::new_current_thread()
.enable_all()
.build()
{
Ok(runtime) => runtime,
Err(error) => {
eprintln!("failed to create probe runtime: {error}");
process::exit(1);
}
};
match runtime.block_on(run_probe()) {
Ok(()) => println!("ok"),
Err(error) => {
eprintln!("{error}");
process::exit(1);
}
}
}
async fn run_probe() -> Result<(), String> {
let proxy_url = env::var(PROBE_PROXY_ENV).ok();
let target_url = env::var(PROBE_URL_ENV).ok();
let mut builder = reqwest::Client::builder();
if target_url.is_some() {
builder = builder.timeout(Duration::from_secs(5));
}
if env::var_os(PROBE_TLS13_ENV).is_some() {
builder = builder.min_tls_version(reqwest::tls::Version::TLS_1_3);
}
let client = build_probe_client(builder, proxy_url.as_deref())?;
if let Some(url) = target_url {
post_probe_request(&client, &url).await?;
}
Ok(())
}
fn build_probe_client(
builder: reqwest::ClientBuilder,
proxy_url: Option<&str>,
) -> Result<reqwest::Client, String> {
if let Some(proxy_url) = proxy_url {
let proxy = reqwest::Proxy::https(proxy_url)
.map_err(|error| format!("failed to configure probe proxy {proxy_url}: {error}"))?;
return codex_client::build_reqwest_client_with_custom_ca(builder.proxy(proxy))
.map_err(|error| error.to_string());
}
codex_client::build_reqwest_client_for_subprocess_tests(builder)
.map_err(|error| error.to_string())
}
async fn post_probe_request(client: &reqwest::Client, url: &str) -> Result<(), String> {
let response = client
.post(url)
.header("Content-Type", "application/x-www-form-urlencoded")
.body("grant_type=authorization_code&code=test")
.send()
.await
.map_err(|error| format!("probe request failed: {error:?}"))?;
let status = response.status();
let body = response
.text()
.await
.map_err(|error| format!("failed to read probe response body: {error}"))?;
if !status.is_success() {
return Err(format!("probe request returned {status}: {body}"));
}
if body != "ok" {
return Err(format!("probe response body mismatch: {body}"));
}
Ok(())
}

View File

@@ -14,10 +14,9 @@
//! `TRUSTED CERTIFICATE` labels and bundles that also contain CRLs
//! - return user-facing errors that explain how to fix misconfigured CA files
//!
//! It does not validate certificate chains or perform a handshake in tests. Its contract is
//! narrower: produce a transport configuration whose root store contains every parseable
//! certificate block from the configured PEM bundle, or fail early with a precise error before
//! the caller starts network traffic.
//! Its production contract is narrow: produce a transport configuration whose root store contains
//! every parseable certificate block from the configured PEM bundle, or fail early with a precise
//! error before the caller starts network traffic.
//!
//! In this module's test setup, a hermetic test is one whose result depends only on the CA file
//! and environment variables that the test chose for itself. That matters here because the normal
@@ -36,7 +35,8 @@
//! - unit tests in this module cover env-selection logic without constructing a real client
//! - subprocess integration tests under `tests/` cover real client construction through
//! [`build_reqwest_client_for_subprocess_tests`], which disables reqwest proxy autodetection so
//! the tests can observe custom-CA success and failure directly
//! the tests can observe custom-CA success and failure directly, including one TLS handshake
//! through a local HTTPS server
//! - those subprocess tests also scrub inherited CA environment variables before launch so their
//! result depends only on the test fixtures and env vars set by the test itself
@@ -266,12 +266,21 @@ fn maybe_build_rustls_client_config_with_env(
/// This exists so tests can exercise precedence behavior deterministically without mutating the
/// real process environment. It selects the CA bundle, delegates file parsing to
/// [`ConfiguredCaBundle::load_certificates`], preserves the caller's chosen `reqwest` builder
/// configuration, and finally registers each parsed certificate with that builder.
/// configuration, forces rustls when a custom CA is configured, and finally registers each parsed
/// certificate with that builder.
fn build_reqwest_client_with_env(
env_source: &dyn EnvSource,
mut builder: reqwest::ClientBuilder,
) -> Result<reqwest::Client, BuildCustomCaTransportError> {
if let Some(bundle) = env_source.configured_ca_bundle() {
ensure_rustls_crypto_provider();
info!(
source_env = bundle.source_env,
ca_path = %bundle.path.display(),
"building HTTP client with rustls backend for custom CA bundle"
);
builder = builder.use_rustls_tls();
let certificates = bundle.load_certificates()?;
for (idx, cert) in certificates.iter().enumerate() {

View File

@@ -4,24 +4,83 @@
//! `build_reqwest_client_for_subprocess_tests` instead of calling the helper in-process. The
//! detailed explanation of what "hermetic" means here lives in `codex_client::custom_ca`; these
//! tests add the process-level half of that contract by scrubbing inherited CA environment
//! variables before each subprocess launch. They still stop at client construction: the
//! assertions here cover CA file selection, PEM parsing, and user-facing errors, not a full TLS
//! handshake.
//! variables before each subprocess launch. Most assertions here cover CA file selection, PEM
//! parsing, and user-facing errors. The HTTPS probes go further and perform real POSTs against
//! locally generated certificates, including through a TLS-intercepting CONNECT proxy.
use codex_utils_cargo_bin::cargo_bin;
use rcgen::BasicConstraints;
use rcgen::CertificateParams;
use rcgen::CertifiedIssuer;
use rcgen::DistinguishedName;
use rcgen::DnType;
use rcgen::ExtendedKeyUsagePurpose;
use rcgen::IsCa;
use rcgen::KeyPair;
use rcgen::KeyUsagePurpose;
use rcgen::PKCS_ECDSA_P256_SHA256;
use rustls_pki_types::CertificateDer;
use rustls_pki_types::PrivateKeyDer;
use std::fs;
use std::io;
use std::io::Read;
use std::io::Write;
use std::net::TcpListener;
use std::net::TcpStream;
use std::path::Path;
use std::path::PathBuf;
use std::process::Command;
use std::sync::Arc;
use std::sync::mpsc;
use std::thread;
use std::time::Duration;
use std::time::Instant;
use tempfile::TempDir;
const CODEX_CA_CERT_ENV: &str = "CODEX_CA_CERTIFICATE";
const PROBE_PROXY_ENV: &str = "CODEX_CUSTOM_CA_PROBE_PROXY";
const PROBE_TLS13_ENV: &str = "CODEX_CUSTOM_CA_PROBE_TLS13";
const PROBE_URL_ENV: &str = "CODEX_CUSTOM_CA_PROBE_URL";
const SSL_CERT_FILE_ENV: &str = "SSL_CERT_FILE";
const PROXY_ENV_VARS: &[&str] = &[
"HTTP_PROXY",
"http_proxy",
"HTTPS_PROXY",
"https_proxy",
"ALL_PROXY",
"all_proxy",
"NO_PROXY",
"no_proxy",
];
const TEST_CERT_1: &str = include_str!("fixtures/test-ca.pem");
const TEST_CERT_2: &str = include_str!("fixtures/test-intermediate.pem");
const TRUSTED_TEST_CERT: &str = include_str!("fixtures/test-ca-trusted.pem");
fn write_cert_file(temp_dir: &TempDir, name: &str, contents: &str) -> std::path::PathBuf {
struct Tls13Material {
ca_cert_pem: String,
server_cert: CertificateDer<'static>,
server_key: PrivateKeyDer<'static>,
}
struct Tls13TestServer {
ca_cert_pem: String,
request_rx: mpsc::Receiver<Result<String, String>>,
url: String,
}
struct PlainHttpOrigin {
request_rx: mpsc::Receiver<Result<String, String>>,
url: String,
}
struct TlsInterceptingProxy {
ca_cert_pem: String,
request_rx: mpsc::Receiver<Result<String, String>>,
url: String,
}
fn write_cert_file(temp_dir: &TempDir, name: &str, contents: &str) -> PathBuf {
let path = temp_dir.path().join(name);
fs::write(&path, contents).unwrap_or_else(|error| {
panic!("write cert fixture failed for {}: {error}", path.display())
@@ -29,7 +88,7 @@ fn write_cert_file(temp_dir: &TempDir, name: &str, contents: &str) -> std::path:
path
}
fn run_probe(envs: &[(&str, &Path)]) -> std::process::Output {
fn probe_command() -> Command {
let mut cmd = Command::new(
cargo_bin("custom_ca_probe")
.unwrap_or_else(|error| panic!("failed to locate custom_ca_probe: {error}")),
@@ -37,7 +96,18 @@ fn run_probe(envs: &[(&str, &Path)]) -> std::process::Output {
// `Command` inherits the parent environment by default, so scrub CA-related variables first or
// these tests can accidentally pass/fail based on the developer shell or CI runner.
cmd.env_remove(CODEX_CA_CERT_ENV);
cmd.env_remove(PROBE_PROXY_ENV);
cmd.env_remove(PROBE_TLS13_ENV);
cmd.env_remove(PROBE_URL_ENV);
cmd.env_remove(SSL_CERT_FILE_ENV);
for env_var in PROXY_ENV_VARS {
cmd.env_remove(env_var);
}
cmd
}
fn run_probe(envs: &[(&str, &Path)]) -> std::process::Output {
let mut cmd = probe_command();
for (key, value) in envs {
cmd.env(key, value);
}
@@ -45,6 +115,286 @@ fn run_probe(envs: &[(&str, &Path)]) -> std::process::Output {
.unwrap_or_else(|error| panic!("failed to run custom_ca_probe: {error}"))
}
fn run_probe_posting_to_tls13_server(envs: &[(&str, &Path)], url: &str) -> std::process::Output {
let mut cmd = probe_command();
for (key, value) in envs {
cmd.env(key, value);
}
cmd.env(PROBE_TLS13_ENV, "1");
cmd.env(PROBE_URL_ENV, url);
cmd.output()
.unwrap_or_else(|error| panic!("failed to run custom_ca_probe: {error}"))
}
fn run_probe_posting_through_tls_intercepting_proxy(
envs: &[(&str, &Path)],
url: &str,
proxy_url: &str,
) -> std::process::Output {
let mut cmd = probe_command();
for (key, value) in envs {
cmd.env(key, value);
}
cmd.env(PROBE_PROXY_ENV, proxy_url);
cmd.env(PROBE_TLS13_ENV, "1");
cmd.env(PROBE_URL_ENV, url);
cmd.output()
.unwrap_or_else(|error| panic!("failed to run custom_ca_probe: {error}"))
}
fn spawn_tls13_test_server() -> Tls13TestServer {
codex_utils_rustls_provider::ensure_rustls_crypto_provider();
let material = generate_tls13_material();
let listener = TcpListener::bind(("127.0.0.1", 0))
.unwrap_or_else(|error| panic!("bind TLS test server: {error}"));
listener
.set_nonblocking(true)
.unwrap_or_else(|error| panic!("set TLS test server nonblocking: {error}"));
let port = listener
.local_addr()
.unwrap_or_else(|error| panic!("TLS test server addr: {error}"))
.port();
let config = Arc::new(
rustls::ServerConfig::builder_with_protocol_versions(&[&rustls::version::TLS13])
.with_no_client_auth()
.with_single_cert(vec![material.server_cert], material.server_key)
.unwrap_or_else(|error| panic!("TLS 1.3 server config: {error}")),
);
let (request_tx, request_rx) = mpsc::channel();
thread::spawn(move || {
let result = accept_tls13_request(listener, config);
let _ = request_tx.send(result.map_err(|error| error.to_string()));
});
Tls13TestServer {
ca_cert_pem: material.ca_cert_pem,
request_rx,
url: format!("https://127.0.0.1:{port}/oauth/token"),
}
}
fn spawn_plain_http_origin() -> PlainHttpOrigin {
let listener = TcpListener::bind(("127.0.0.1", 0))
.unwrap_or_else(|error| panic!("bind plain HTTP origin: {error}"));
listener
.set_nonblocking(true)
.unwrap_or_else(|error| panic!("set plain HTTP origin nonblocking: {error}"));
let port = listener
.local_addr()
.unwrap_or_else(|error| panic!("plain HTTP origin addr: {error}"))
.port();
let (request_tx, request_rx) = mpsc::channel();
thread::spawn(move || {
let result = accept_plain_http_origin_request(listener);
let _ = request_tx.send(result.map_err(|error| error.to_string()));
});
PlainHttpOrigin {
request_rx,
url: format!("https://127.0.0.1:{port}/oauth/token"),
}
}
fn spawn_tls_intercepting_proxy() -> TlsInterceptingProxy {
codex_utils_rustls_provider::ensure_rustls_crypto_provider();
let material = generate_tls13_material();
let listener = TcpListener::bind(("127.0.0.1", 0))
.unwrap_or_else(|error| panic!("bind TLS intercepting proxy: {error}"));
listener
.set_nonblocking(true)
.unwrap_or_else(|error| panic!("set TLS intercepting proxy nonblocking: {error}"));
let port = listener
.local_addr()
.unwrap_or_else(|error| panic!("TLS intercepting proxy addr: {error}"))
.port();
let config = Arc::new(
rustls::ServerConfig::builder_with_protocol_versions(&[&rustls::version::TLS13])
.with_no_client_auth()
.with_single_cert(vec![material.server_cert], material.server_key)
.unwrap_or_else(|error| panic!("TLS intercepting proxy config: {error}")),
);
let (request_tx, request_rx) = mpsc::channel();
thread::spawn(move || {
let result = accept_tls_intercepting_proxy_request(listener, config);
let _ = request_tx.send(result.map_err(|error| error.to_string()));
});
TlsInterceptingProxy {
ca_cert_pem: material.ca_cert_pem,
request_rx,
url: format!("http://127.0.0.1:{port}"),
}
}
fn generate_tls13_material() -> Tls13Material {
let mut ca_params = CertificateParams::default();
ca_params.is_ca = IsCa::Ca(BasicConstraints::Unconstrained);
ca_params.key_usages = vec![KeyUsagePurpose::KeyCertSign, KeyUsagePurpose::CrlSign];
let mut ca_distinguished_name = DistinguishedName::new();
ca_distinguished_name.push(DnType::CommonName, "codex test CA");
ca_params.distinguished_name = ca_distinguished_name;
let ca_key_pair = KeyPair::generate_for(&PKCS_ECDSA_P256_SHA256)
.unwrap_or_else(|error| panic!("generate test CA key pair: {error}"));
let ca = CertifiedIssuer::self_signed(ca_params, ca_key_pair)
.unwrap_or_else(|error| panic!("generate test CA certificate: {error}"));
let mut server_params =
CertificateParams::new(vec!["localhost".to_string(), "127.0.0.1".to_string()])
.unwrap_or_else(|error| panic!("create test server certificate params: {error}"));
server_params.extended_key_usages = vec![ExtendedKeyUsagePurpose::ServerAuth];
server_params.key_usages = vec![
KeyUsagePurpose::DigitalSignature,
KeyUsagePurpose::KeyEncipherment,
];
let server_key_pair = KeyPair::generate_for(&PKCS_ECDSA_P256_SHA256)
.unwrap_or_else(|error| panic!("generate test server key pair: {error}"));
let server_cert = server_params
.signed_by(&server_key_pair, &ca)
.unwrap_or_else(|error| panic!("generate test server certificate: {error}"));
Tls13Material {
ca_cert_pem: ca.pem(),
server_cert: server_cert.der().clone(),
server_key: PrivateKeyDer::from(server_key_pair),
}
}
fn accept_plain_http_origin_request(listener: TcpListener) -> io::Result<String> {
let mut stream = accept_with_timeout(listener, Duration::from_secs(5))?;
stream.set_nonblocking(false)?;
stream.set_read_timeout(Some(Duration::from_secs(5)))?;
stream.set_write_timeout(Some(Duration::from_secs(5)))?;
let request = read_http_message(&mut stream)?;
stream.write_all(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\nConnection: close\r\n\r\nok")?;
stream.flush()?;
Ok(request)
}
fn accept_tls13_request(
listener: TcpListener,
config: Arc<rustls::ServerConfig>,
) -> io::Result<String> {
let stream = accept_with_timeout(listener, Duration::from_secs(5))?;
stream.set_nonblocking(false)?;
stream.set_read_timeout(Some(Duration::from_secs(5)))?;
stream.set_write_timeout(Some(Duration::from_secs(5)))?;
let connection = rustls::ServerConnection::new(config).map_err(io::Error::other)?;
let mut tls = rustls::StreamOwned::new(connection, stream);
let request = read_http_message(&mut tls)?;
tls.write_all(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\nConnection: close\r\n\r\nok")?;
tls.flush()?;
Ok(request)
}
fn accept_tls_intercepting_proxy_request(
listener: TcpListener,
config: Arc<rustls::ServerConfig>,
) -> io::Result<String> {
let mut stream = accept_with_timeout(listener, Duration::from_secs(5))?;
stream.set_nonblocking(false)?;
stream.set_read_timeout(Some(Duration::from_secs(5)))?;
stream.set_write_timeout(Some(Duration::from_secs(5)))?;
let connect_request = read_http_message(&mut stream)?;
let origin_authority = connect_authority_from_request(&connect_request)?;
stream.write_all(b"HTTP/1.1 200 Connection Established\r\n\r\n")?;
stream.flush()?;
let connection = rustls::ServerConnection::new(config).map_err(io::Error::other)?;
let mut tls = rustls::StreamOwned::new(connection, stream);
let request = read_http_message(&mut tls)?;
let mut origin = TcpStream::connect(origin_authority.as_str())?;
origin.set_read_timeout(Some(Duration::from_secs(5)))?;
origin.set_write_timeout(Some(Duration::from_secs(5)))?;
origin.write_all(request.as_bytes())?;
origin.flush()?;
let response = read_http_message(&mut origin)?;
tls.write_all(response.as_bytes())?;
tls.flush()?;
Ok(request)
}
fn connect_authority_from_request(request: &str) -> io::Result<String> {
let request_line = request
.lines()
.next()
.ok_or_else(|| io::Error::new(io::ErrorKind::InvalidData, "empty CONNECT request"))?;
let mut parts = request_line.split_whitespace();
match (parts.next(), parts.next(), parts.next()) {
(Some("CONNECT"), Some(authority), Some(_version)) => Ok(authority.to_string()),
_ => Err(io::Error::new(
io::ErrorKind::InvalidData,
format!("invalid CONNECT request line: {request_line}"),
)),
}
}
fn accept_with_timeout(listener: TcpListener, timeout: Duration) -> io::Result<TcpStream> {
let deadline = Instant::now() + timeout;
loop {
match listener.accept() {
Ok((stream, _)) => return Ok(stream),
Err(error) if error.kind() == io::ErrorKind::WouldBlock => {
if Instant::now() >= deadline {
return Err(io::Error::new(
io::ErrorKind::TimedOut,
"timed out waiting for TLS test client",
));
}
thread::sleep(Duration::from_millis(10));
}
Err(error) => return Err(error),
}
}
}
fn read_http_message(stream: &mut impl Read) -> io::Result<String> {
let mut buffer = Vec::new();
let mut chunk = [0; 1024];
loop {
let bytes_read = stream.read(&mut chunk)?;
if bytes_read == 0 {
break;
}
buffer.extend_from_slice(&chunk[..bytes_read]);
if let Some(header_end) = buffer.windows(4).position(|window| window == b"\r\n\r\n") {
let body_start = header_end + 4;
let headers = String::from_utf8_lossy(&buffer[..body_start]);
let content_length = headers
.lines()
.filter_map(|line| line.split_once(':'))
.find_map(|(name, value)| {
name.eq_ignore_ascii_case("content-length")
.then(|| value.trim().parse::<usize>().ok())
.flatten()
})
.unwrap_or(0);
if buffer.len() >= body_start + content_length {
break;
}
}
}
Ok(String::from_utf8_lossy(&buffer).into_owned())
}
fn assert_token_exchange_request(request: &str) {
assert!(
request.starts_with("POST /oauth/token HTTP/1.1"),
"unexpected request:\n{request}"
);
assert!(
request.contains("grant_type=authorization_code&code=test"),
"unexpected request body:\n{request}"
);
}
#[test]
fn uses_codex_ca_cert_env() {
let temp_dir = TempDir::new().expect("tempdir");
@@ -90,6 +440,59 @@ fn handles_multi_certificate_bundle() {
assert!(output.status.success());
}
#[test]
fn posts_to_tls13_server_using_custom_ca_bundle() {
let temp_dir = TempDir::new().expect("tempdir");
let server = spawn_tls13_test_server();
let cert_path = write_cert_file(&temp_dir, "tls-ca.pem", &server.ca_cert_pem);
let output =
run_probe_posting_to_tls13_server(&[(CODEX_CA_CERT_ENV, cert_path.as_path())], &server.url);
let server_result = server.request_rx.recv_timeout(Duration::from_secs(5));
assert!(
output.status.success(),
"custom_ca_probe failed\nstdout:\n{}\nstderr:\n{}\nserver:\n{server_result:?}",
String::from_utf8_lossy(&output.stdout),
String::from_utf8_lossy(&output.stderr)
);
let request = server_result
.expect("TLS test server should report a request")
.expect("TLS test server should accept the probe request");
assert_token_exchange_request(&request);
}
#[test]
fn posts_to_token_origin_through_tls_intercepting_proxy_with_custom_ca_bundle() {
let temp_dir = TempDir::new().expect("tempdir");
let origin = spawn_plain_http_origin();
let proxy = spawn_tls_intercepting_proxy();
let cert_path = write_cert_file(&temp_dir, "proxy-ca.pem", &proxy.ca_cert_pem);
let output = run_probe_posting_through_tls_intercepting_proxy(
&[(CODEX_CA_CERT_ENV, cert_path.as_path())],
&origin.url,
&proxy.url,
);
let proxy_result = proxy.request_rx.recv_timeout(Duration::from_secs(5));
let origin_result = origin.request_rx.recv_timeout(Duration::from_secs(5));
assert!(
output.status.success(),
"custom_ca_probe failed\nstdout:\n{}\nstderr:\n{}\nproxy:\n{proxy_result:?}\norigin:\n{origin_result:?}",
String::from_utf8_lossy(&output.stdout),
String::from_utf8_lossy(&output.stderr)
);
let proxy_request = proxy_result
.expect("TLS intercepting proxy should report a request")
.expect("TLS intercepting proxy should accept the probe request");
let origin_request = origin_result
.expect("plain HTTP origin should report a request")
.expect("plain HTTP origin should accept the forwarded request");
assert_token_exchange_request(&proxy_request);
assert_token_exchange_request(&origin_request);
}
#[test]
fn rejects_empty_pem_file_with_hint() {
let temp_dir = TempDir::new().expect("tempdir");

View File

@@ -97,7 +97,7 @@ use codex_protocol::protocol::TurnDiffEvent;
use codex_protocol::protocol::WarningEvent;
use codex_protocol::user_input::UserInput;
use codex_tools::ToolName;
use codex_tools::filter_tool_suggest_discoverable_tools_for_client;
use codex_tools::filter_request_plugin_install_discoverable_tools_for_client;
use codex_utils_stream_parser::AssistantTextChunk;
use codex_utils_stream_parser::AssistantTextStreamParser;
use codex_utils_stream_parser::ProposedPlanSegment;
@@ -1170,7 +1170,7 @@ pub(crate) async fn built_tools(
)
.await
.map(|discoverable_tools| {
filter_tool_suggest_discoverable_tools_for_client(
filter_request_plugin_install_discoverable_tools_for_client(
discoverable_tools,
turn_context.app_server_client_name.as_deref(),
)


@@ -10,11 +10,11 @@ pub(crate) mod multi_agents_common;
pub(crate) mod multi_agents_v2;
mod plan;
mod request_permissions;
mod request_plugin_install;
mod request_user_input;
mod shell;
mod test_sync;
mod tool_search;
mod tool_suggest;
mod unavailable_tool;
pub(crate) mod unified_exec;
mod view_image;
@@ -43,12 +43,12 @@ pub use mcp::McpHandler;
pub use mcp_resource::McpResourceHandler;
pub use plan::PlanHandler;
pub use request_permissions::RequestPermissionsHandler;
pub use request_plugin_install::RequestPluginInstallHandler;
pub use request_user_input::RequestUserInputHandler;
pub use shell::ShellCommandHandler;
pub use shell::ShellHandler;
pub use test_sync::TestSyncHandler;
pub use tool_search::ToolSearchHandler;
pub use tool_suggest::ToolSuggestHandler;
pub use unavailable_tool::UnavailableToolHandler;
pub(crate) use unavailable_tool::unavailable_tool_message;
pub use unified_exec::UnifiedExecHandler;


@@ -8,15 +8,15 @@ use codex_rmcp_client::ElicitationResponse;
use codex_tools::DiscoverableTool;
use codex_tools::DiscoverableToolAction;
use codex_tools::DiscoverableToolType;
use codex_tools::TOOL_SUGGEST_PERSIST_ALWAYS_VALUE;
use codex_tools::TOOL_SUGGEST_PERSIST_KEY;
use codex_tools::TOOL_SUGGEST_TOOL_NAME;
use codex_tools::ToolSuggestArgs;
use codex_tools::ToolSuggestResult;
use codex_tools::all_suggested_connectors_picked_up;
use codex_tools::build_tool_suggestion_elicitation_request;
use codex_tools::filter_tool_suggest_discoverable_tools_for_client;
use codex_tools::verified_connector_suggestion_completed;
use codex_tools::REQUEST_PLUGIN_INSTALL_PERSIST_ALWAYS_VALUE;
use codex_tools::REQUEST_PLUGIN_INSTALL_PERSIST_KEY;
use codex_tools::REQUEST_PLUGIN_INSTALL_TOOL_NAME;
use codex_tools::RequestPluginInstallArgs;
use codex_tools::RequestPluginInstallResult;
use codex_tools::all_requested_connectors_picked_up;
use codex_tools::build_request_plugin_install_elicitation_request;
use codex_tools::filter_request_plugin_install_discoverable_tools_for_client;
use codex_tools::verified_connector_install_completed;
use rmcp::model::RequestId;
use serde_json::Value;
use tracing::warn;
@@ -32,9 +32,9 @@ use crate::tools::handlers::parse_arguments;
use crate::tools::registry::ToolHandler;
use crate::tools::registry::ToolKind;
pub struct ToolSuggestHandler;
pub struct RequestPluginInstallHandler;
impl ToolHandler for ToolSuggestHandler {
impl ToolHandler for RequestPluginInstallHandler {
type Output = FunctionToolOutput;
fn kind(&self) -> ToolKind {
@@ -43,7 +43,7 @@ impl ToolHandler for ToolSuggestHandler {
#[expect(
clippy::await_holding_invalid_type,
reason = "tool suggestion discovery reads through the session-owned manager guard"
reason = "plugin install discovery reads through the session-owned manager guard"
)]
async fn handle(&self, invocation: ToolInvocation) -> Result<Self::Output, FunctionCallError> {
let ToolInvocation {
@@ -58,12 +58,12 @@ impl ToolHandler for ToolSuggestHandler {
ToolPayload::Function { arguments } => arguments,
_ => {
return Err(FunctionCallError::Fatal(format!(
"{TOOL_SUGGEST_TOOL_NAME} handler received unsupported payload"
"{REQUEST_PLUGIN_INSTALL_TOOL_NAME} handler received unsupported payload"
)));
}
};
let args: ToolSuggestArgs = parse_arguments(&arguments)?;
let args: RequestPluginInstallArgs = parse_arguments(&arguments)?;
let suggest_reason = args.suggest_reason.trim();
if suggest_reason.is_empty() {
return Err(FunctionCallError::RespondToModel(
@@ -72,14 +72,15 @@ impl ToolHandler for ToolSuggestHandler {
}
if args.action_type != DiscoverableToolAction::Install {
return Err(FunctionCallError::RespondToModel(
"tool suggestions currently support only action_type=\"install\"".to_string(),
"plugin install requests currently support only action_type=\"install\""
.to_string(),
));
}
if args.tool_type == DiscoverableToolType::Plugin
&& turn.app_server_client_name.as_deref() == Some("codex-tui")
{
return Err(FunctionCallError::RespondToModel(
"plugin tool suggestions are not available in codex-tui yet".to_string(),
"plugin install requests are not available in codex-tui yet".to_string(),
));
}
@@ -98,14 +99,14 @@ impl ToolHandler for ToolSuggestHandler {
)
.await
.map(|discoverable_tools| {
filter_tool_suggest_discoverable_tools_for_client(
filter_request_plugin_install_discoverable_tools_for_client(
discoverable_tools,
turn.app_server_client_name.as_deref(),
)
})
.map_err(|err| {
FunctionCallError::RespondToModel(format!(
"tool suggestions are unavailable right now: {err}"
"plugin install requests are unavailable right now: {err}"
))
})?;
@@ -114,12 +115,12 @@ impl ToolHandler for ToolSuggestHandler {
.find(|tool| tool.tool_type() == args.tool_type && tool.id() == args.tool_id)
.ok_or_else(|| {
FunctionCallError::RespondToModel(format!(
"tool_id must match one of the discoverable tools exposed by {TOOL_SUGGEST_TOOL_NAME}"
"tool_id must match one of the discoverable tools exposed by {REQUEST_PLUGIN_INSTALL_TOOL_NAME}"
))
})?;
let request_id = RequestId::String(format!("tool_suggestion_{call_id}").into());
let params = build_tool_suggestion_elicitation_request(
let request_id = RequestId::String(format!("request_plugin_install_{call_id}").into());
let params = build_request_plugin_install_elicitation_request(
CODEX_APPS_MCP_SERVER_NAME,
session.conversation_id.to_string(),
turn.sub_id.clone(),
@@ -131,14 +132,14 @@ impl ToolHandler for ToolSuggestHandler {
.request_mcp_server_elicitation(turn.as_ref(), request_id, params)
.await;
if let Some(response) = response.as_ref() {
maybe_persist_tool_suggest_disable(&session, &turn, &tool, response).await;
maybe_persist_disabled_install_request(&session, &turn, &tool, response).await;
}
let user_confirmed = response
.as_ref()
.is_some_and(|response| response.action == ElicitationAction::Accept);
let completed = if user_confirmed {
verify_tool_suggestion_completed(&session, &turn, &tool, auth.as_ref()).await
verify_request_plugin_install_completed(&session, &turn, &tool, auth.as_ref()).await
} else {
false
};
@@ -149,7 +150,7 @@ impl ToolHandler for ToolSuggestHandler {
.await;
}
let content = serde_json::to_string(&ToolSuggestResult {
let content = serde_json::to_string(&RequestPluginInstallResult {
completed,
user_confirmed,
tool_type: args.tool_type,
@@ -160,7 +161,7 @@ impl ToolHandler for ToolSuggestHandler {
})
.map_err(|err| {
FunctionCallError::Fatal(format!(
"failed to serialize {TOOL_SUGGEST_TOOL_NAME} response: {err}"
"failed to serialize {REQUEST_PLUGIN_INSTALL_TOOL_NAME} response: {err}"
))
})?;
@@ -168,17 +169,17 @@ impl ToolHandler for ToolSuggestHandler {
}
}
async fn maybe_persist_tool_suggest_disable(
async fn maybe_persist_disabled_install_request(
session: &crate::session::session::Session,
turn: &crate::session::turn_context::TurnContext,
tool: &DiscoverableTool,
response: &ElicitationResponse,
) {
if !tool_suggest_response_requests_persistent_disable(response) {
if !request_plugin_install_response_requests_persistent_disable(response) {
return;
}
if let Err(err) = persist_tool_suggest_disable(&turn.config.codex_home, tool).await {
if let Err(err) = persist_disabled_install_request(&turn.config.codex_home, tool).await {
warn!(
error = %err,
tool_id = tool.id(),
@@ -190,7 +191,9 @@ async fn maybe_persist_tool_suggest_disable(
session.reload_user_config_layer().await;
}
fn tool_suggest_response_requests_persistent_disable(response: &ElicitationResponse) -> bool {
fn request_plugin_install_response_requests_persistent_disable(
response: &ElicitationResponse,
) -> bool {
if response.action != ElicitationAction::Decline {
return false;
}
@@ -199,24 +202,24 @@ fn tool_suggest_response_requests_persistent_disable(response: &ElicitationRespo
.meta
.as_ref()
.and_then(Value::as_object)
.and_then(|meta| meta.get(TOOL_SUGGEST_PERSIST_KEY))
.and_then(|meta| meta.get(REQUEST_PLUGIN_INSTALL_PERSIST_KEY))
.and_then(Value::as_str)
== Some(TOOL_SUGGEST_PERSIST_ALWAYS_VALUE)
== Some(REQUEST_PLUGIN_INSTALL_PERSIST_ALWAYS_VALUE)
}
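The decline-persistence check above only fires when the user declined and the elicitation metadata carries `persist = "always"`. A minimal std-only sketch of that predicate, with `Action` and `requests_persistent_disable` as hypothetical stand-ins for the crate's `ElicitationAction` and the renamed function, and the meta map reduced to an `Option<&str>`:

```rust
// Hypothetical sketch: persist the disable only when the user declined AND
// the elicitation meta's persist value is "always".
#[derive(PartialEq)]
enum Action {
    Accept,
    Decline,
}

fn requests_persistent_disable(action: Action, persist_meta: Option<&str>) -> bool {
    if action != Action::Decline {
        return false;
    }
    persist_meta == Some("always")
}

fn main() {
    assert!(requests_persistent_disable(Action::Decline, Some("always")));
    assert!(!requests_persistent_disable(Action::Accept, Some("always")));
    assert!(!requests_persistent_disable(Action::Decline, Some("session")));
    assert!(!requests_persistent_disable(Action::Decline, None));
}
```

This mirrors the four cases exercised by `request_plugin_install_response_persists_only_decline_always_mode` in the test file: only the decline-with-"always" combination persists.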
async fn persist_tool_suggest_disable(
async fn persist_disabled_install_request(
codex_home: &codex_utils_absolute_path::AbsolutePathBuf,
tool: &DiscoverableTool,
) -> anyhow::Result<()> {
ConfigEditsBuilder::new(codex_home)
.with_edits([ConfigEdit::AddToolSuggestDisabledTool(
disabled_tool_suggestion(tool),
disabled_install_request(tool),
)])
.apply()
.await
}
fn disabled_tool_suggestion(tool: &DiscoverableTool) -> ToolSuggestDisabledTool {
fn disabled_install_request(tool: &DiscoverableTool) -> ToolSuggestDisabledTool {
match tool {
DiscoverableTool::Connector(connector) => {
ToolSuggestDisabledTool::connector(connector.id.as_str())
@@ -225,14 +228,14 @@ fn disabled_tool_suggestion(tool: &DiscoverableTool) -> ToolSuggestDisabledTool
}
}
async fn verify_tool_suggestion_completed(
async fn verify_request_plugin_install_completed(
session: &crate::session::session::Session,
turn: &crate::session::turn_context::TurnContext,
tool: &DiscoverableTool,
auth: Option<&codex_login::CodexAuth>,
) -> bool {
match tool {
DiscoverableTool::Connector(connector) => refresh_missing_suggested_connectors(
DiscoverableTool::Connector(connector) => refresh_missing_requested_connectors(
session,
turn,
auth,
@@ -241,17 +244,17 @@ async fn verify_tool_suggestion_completed(
)
.await
.is_some_and(|accessible_connectors| {
verified_connector_suggestion_completed(connector.id.as_str(), &accessible_connectors)
verified_connector_install_completed(connector.id.as_str(), &accessible_connectors)
}),
DiscoverableTool::Plugin(plugin) => {
session.reload_user_config_layer().await;
let config = session.get_config().await;
let completed = verified_plugin_suggestion_completed(
let completed = verified_plugin_install_completed(
plugin.id.as_str(),
config.as_ref(),
session.services.plugins_manager.as_ref(),
);
let _ = refresh_missing_suggested_connectors(
let _ = refresh_missing_requested_connectors(
session,
turn,
auth,
@@ -268,7 +271,7 @@ async fn verify_tool_suggestion_completed(
clippy::await_holding_invalid_type,
reason = "connector cache refresh reads through the session-owned manager guard"
)]
async fn refresh_missing_suggested_connectors(
async fn refresh_missing_requested_connectors(
session: &crate::session::session::Session,
turn: &crate::session::turn_context::TurnContext,
auth: Option<&codex_login::CodexAuth>,
@@ -285,7 +288,7 @@ async fn refresh_missing_suggested_connectors(
connectors::accessible_connectors_from_mcp_tools(&mcp_tools),
&turn.config,
);
if all_suggested_connectors_picked_up(expected_connector_ids, &accessible_connectors) {
if all_requested_connectors_picked_up(expected_connector_ids, &accessible_connectors) {
return Some(accessible_connectors);
}
@@ -304,14 +307,14 @@ async fn refresh_missing_suggested_connectors(
}
Err(err) => {
warn!(
"failed to refresh codex apps tools cache after tool suggestion for {tool_id}: {err:#}"
"failed to refresh codex apps tools cache after plugin install request for {tool_id}: {err:#}"
);
None
}
}
}
fn verified_plugin_suggestion_completed(
fn verified_plugin_install_completed(
tool_id: &str,
config: &crate::config::Config,
plugins_manager: &codex_core_plugins::PluginsManager,
@@ -327,5 +330,5 @@ fn verified_plugin_suggestion_completed(
}
#[cfg(test)]
#[path = "tool_suggest_tests.rs"]
#[path = "request_plugin_install_tests.rs"]
mod tests;


@@ -22,7 +22,7 @@ use serde_json::json;
use tempfile::tempdir;
#[tokio::test]
async fn verified_plugin_suggestion_completed_requires_installed_plugin() {
async fn verified_plugin_install_completed_requires_installed_plugin() {
let codex_home = tempdir().expect("tempdir should succeed");
let curated_root = curated_plugins_repo_path(codex_home.path());
write_openai_curated_marketplace(&curated_root, &["sample"]);
@@ -32,7 +32,7 @@ async fn verified_plugin_suggestion_completed_requires_installed_plugin() {
let config = load_plugins_config(codex_home.path()).await;
let plugins_manager = PluginsManager::new(codex_home.path().to_path_buf());
assert!(!verified_plugin_suggestion_completed(
assert!(!verified_plugin_install_completed(
"sample@openai-curated",
&config,
&plugins_manager,
@@ -50,7 +50,7 @@ async fn verified_plugin_suggestion_completed_requires_installed_plugin() {
.expect("plugin should install");
let refreshed_config = load_plugins_config(codex_home.path()).await;
assert!(verified_plugin_suggestion_completed(
assert!(verified_plugin_install_completed(
"sample@openai-curated",
&refreshed_config,
&plugins_manager,
@@ -58,43 +58,47 @@ async fn verified_plugin_suggestion_completed_requires_installed_plugin() {
}
#[test]
fn tool_suggest_response_persists_only_decline_always_mode() {
assert!(tool_suggest_response_requests_persistent_disable(
fn request_plugin_install_response_persists_only_decline_always_mode() {
assert!(request_plugin_install_response_requests_persistent_disable(
&ElicitationResponse {
action: ElicitationAction::Decline,
content: None,
meta: Some(json!({ TOOL_SUGGEST_PERSIST_KEY: TOOL_SUGGEST_PERSIST_ALWAYS_VALUE })),
meta: Some(json!({
REQUEST_PLUGIN_INSTALL_PERSIST_KEY: REQUEST_PLUGIN_INSTALL_PERSIST_ALWAYS_VALUE
})),
}
));
assert!(!tool_suggest_response_requests_persistent_disable(
&ElicitationResponse {
assert!(
!request_plugin_install_response_requests_persistent_disable(&ElicitationResponse {
action: ElicitationAction::Accept,
content: None,
meta: Some(json!({ TOOL_SUGGEST_PERSIST_KEY: TOOL_SUGGEST_PERSIST_ALWAYS_VALUE })),
}
));
assert!(!tool_suggest_response_requests_persistent_disable(
&ElicitationResponse {
meta: Some(json!({
REQUEST_PLUGIN_INSTALL_PERSIST_KEY: REQUEST_PLUGIN_INSTALL_PERSIST_ALWAYS_VALUE
})),
})
);
assert!(
!request_plugin_install_response_requests_persistent_disable(&ElicitationResponse {
action: ElicitationAction::Decline,
content: None,
meta: Some(json!({ TOOL_SUGGEST_PERSIST_KEY: "session" })),
}
));
assert!(!tool_suggest_response_requests_persistent_disable(
&ElicitationResponse {
meta: Some(json!({ REQUEST_PLUGIN_INSTALL_PERSIST_KEY: "session" })),
})
);
assert!(
!request_plugin_install_response_requests_persistent_disable(&ElicitationResponse {
action: ElicitationAction::Decline,
content: None,
meta: None,
}
));
})
);
}
#[tokio::test]
async fn persist_tool_suggest_disable_writes_connector_config() {
async fn persist_disabled_install_request_writes_connector_config() {
let codex_home = tempdir().expect("tempdir should succeed");
let tool = connector_tool("connector_calendar", "Google Calendar");
persist_tool_suggest_disable(&codex_home.path().abs(), &tool)
persist_disabled_install_request(&codex_home.path().abs(), &tool)
.await
.expect("persist connector disable");
@@ -111,7 +115,7 @@ async fn persist_tool_suggest_disable_writes_connector_config() {
}
#[tokio::test]
async fn persist_tool_suggest_disable_writes_plugin_config() {
async fn persist_disabled_install_request_writes_plugin_config() {
let codex_home = tempdir().expect("tempdir should succeed");
let tool = DiscoverableTool::Plugin(Box::new(DiscoverablePluginInfo {
id: "slack@openai-curated".to_string(),
@@ -122,7 +126,7 @@ async fn persist_tool_suggest_disable_writes_plugin_config() {
app_connector_ids: Vec::new(),
}));
persist_tool_suggest_disable(&codex_home.path().abs(), &tool)
persist_disabled_install_request(&codex_home.path().abs(), &tool)
.await
.expect("persist plugin disable");
@@ -139,7 +143,7 @@ async fn persist_tool_suggest_disable_writes_plugin_config() {
}
#[tokio::test]
async fn persist_tool_suggest_disable_dedupes_existing_disabled_tools() {
async fn persist_disabled_install_request_dedupes_existing_disabled_tools() {
let codex_home = tempdir().expect("tempdir should succeed");
let tool = connector_tool("connector_calendar", "Google Calendar");
std::fs::write(
@@ -169,7 +173,7 @@ id = "slack@openai-curated"
)
.expect("write config");
persist_tool_suggest_disable(&codex_home.path().abs(), &tool)
persist_disabled_install_request(&codex_home.path().abs(), &tool)
.await
.expect("persist connector disable");


@@ -86,12 +86,12 @@ pub(crate) fn build_specs_with_discoverable_tools(
use crate::tools::handlers::McpResourceHandler;
use crate::tools::handlers::PlanHandler;
use crate::tools::handlers::RequestPermissionsHandler;
use crate::tools::handlers::RequestPluginInstallHandler;
use crate::tools::handlers::RequestUserInputHandler;
use crate::tools::handlers::ShellCommandHandler;
use crate::tools::handlers::ShellHandler;
use crate::tools::handlers::TestSyncHandler;
use crate::tools::handlers::ToolSearchHandler;
use crate::tools::handlers::ToolSuggestHandler;
use crate::tools::handlers::UnavailableToolHandler;
use crate::tools::handlers::UnifiedExecHandler;
use crate::tools::handlers::ViewImageHandler;
@@ -174,7 +174,7 @@ pub(crate) fn build_specs_with_discoverable_tools(
.cloned()
.collect::<Vec<_>>();
let mut tool_search_handler = None;
let tool_suggest_handler = Arc::new(ToolSuggestHandler);
let request_plugin_install_handler = Arc::new(RequestPluginInstallHandler);
let code_mode_handler = Arc::new(CodeModeExecuteHandler);
let code_mode_wait_handler = Arc::new(CodeModeWaitHandler);
let unavailable_tool_handler = Arc::new(UnavailableToolHandler);
@@ -281,8 +281,8 @@ pub(crate) fn build_specs_with_discoverable_tools(
builder.register_handler(handler.name, tool_search_handler.clone());
}
}
ToolHandlerKind::ToolSuggest => {
builder.register_handler(handler.name, tool_suggest_handler.clone());
ToolHandlerKind::RequestPluginInstall => {
builder.register_handler(handler.name, request_plugin_install_handler.clone());
}
ToolHandlerKind::UnifiedExec => {
builder.register_handler(handler.name, unified_exec_handler.clone());


@@ -21,11 +21,11 @@ use codex_tools::ConfiguredToolSpec;
use codex_tools::DiscoverableTool;
use codex_tools::JsonSchema;
use codex_tools::LoadableToolSpec;
use codex_tools::REQUEST_PLUGIN_INSTALL_TOOL_NAME;
use codex_tools::ResponsesApiNamespaceTool;
use codex_tools::ResponsesApiTool;
use codex_tools::ShellCommandBackendConfig;
use codex_tools::TOOL_SEARCH_TOOL_NAME;
use codex_tools::TOOL_SUGGEST_TOOL_NAME;
use codex_tools::ToolName;
use codex_tools::ToolSpec;
use codex_tools::ToolsConfig;
@@ -791,7 +791,7 @@ async fn multi_agent_v2_wait_agent_schema_uses_configured_min_timeout() {
}
#[tokio::test]
async fn tool_suggest_requires_apps_and_plugins_features() {
async fn request_plugin_install_requires_apps_and_plugins_features() {
let model_info = search_capable_model_info().await;
let discoverable_tools = Some(vec![discoverable_connector(
"connector_2128aebfecb84f64a069897515042a44",
@@ -831,7 +831,7 @@ async fn tool_suggest_requires_apps_and_plugins_features() {
assert!(
!tools
.iter()
.any(|tool| tool.name() == TOOL_SUGGEST_TOOL_NAME),
.any(|tool| tool.name() == REQUEST_PLUGIN_INSTALL_TOOL_NAME),
"tool_suggest should be absent when {disabled_feature:?} is disabled"
);
}


@@ -0,0 +1,29 @@
# Request plugin/connector install
Use this tool only to ask the user to install one known plugin or connector from the list below. The list contains known candidates that are not currently installed.
Use this ONLY when all of the following are true:
- The user explicitly asks to use a specific plugin or connector that is not already available in the current context or active `tools` list.
- `tool_search` is not available, or it has already been called and did not find or make the requested tool callable.
- The plugin or connector is one of the known installable plugins or connectors listed below. Only ask to install plugins or connectors from this list.
Do not use this tool for adjacent capabilities, broad recommendations, or tools that merely seem useful. Only use when the user explicitly asks to use that exact listed plugin or connector.
Known plugins/connectors available to install:
{{discoverable_tools}}
Workflow:
1. Check the current context and active `tools` list first. If the currently active tools aren't relevant and `tool_search` is available, call this tool only after `tool_search` has been tried and found no relevant tool.
2. Match the user's explicit request against the known plugin/connector list above. Only proceed when one listed plugin or connector exactly fits.
3. If both connectors and plugins are available to install, use plugins first; only use connectors if the corresponding plugin is installed but the connector is not.
4. If one plugin or connector clearly fits, call `request_plugin_install` with:
- `tool_type`: `connector` or `plugin`
- `action_type`: `install`
- `tool_id`: exact id from the known plugin/connector list above
- `suggest_reason`: concise one-line user-facing reason this plugin or connector can help with the current request
5. After the request flow completes:
- if the user finished the install flow, continue by searching again or using the newly available plugin or connector
- if the user did not finish, continue without that plugin or connector, and don't request it again unless the user explicitly asks for it.
IMPORTANT: DO NOT call this tool in parallel with other tools.
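The workflow above culminates in a `request_plugin_install` call whose arguments the handler validates before eliciting the user. A minimal std-only sketch of the first two checks the handler performs (plain strings stand in for the crate's enums; `validate_request` is a hypothetical helper, not part of the codebase):

```rust
// Hypothetical sketch of the handler's argument validation: a non-empty
// suggest_reason is required, and only action_type="install" is supported.
fn validate_request(suggest_reason: &str, action_type: &str) -> Result<(), String> {
    if suggest_reason.trim().is_empty() {
        return Err("suggest_reason must be a non-empty string".to_string());
    }
    if action_type != "install" {
        return Err(
            "plugin install requests currently support only action_type=\"install\"".to_string(),
        );
    }
    Ok(())
}

fn main() {
    assert!(validate_request("drafts emails from your inbox", "install").is_ok());
    assert!(validate_request("   ", "install").is_err());
    assert!(validate_request("drafts emails", "uninstall").is_err());
}
```

In the real handler these failures surface as `FunctionCallError::RespondToModel`, so the model can recover instead of aborting the turn.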


@@ -1,29 +0,0 @@
# Tool suggestion discovery
Use this tool only to ask the user to install one known plugin or connector from the list below. The list contains known candidates that are not currently installed.
Use this ONLY when all of the following are true:
- The user explicitly wants a specific plugin or connector that is not already available in the current context or active `tools` list.
- `tool_search` is not available, or it has already been called and did not find or make the requested tool callable.
- The tool is one of the known installable plugins or connectors listed below. Only ask to install tools from this list.
Do not use tool suggestion for adjacent capabilities, broad recommendations, or tools that merely seem useful. The user's intent must clearly match one listed tool.
Known plugins/connectors available to install:
{{discoverable_tools}}
Workflow:
1. Check the current context and active `tools` list first. If `tool_search` is available, call `tool_search` before calling `tool_suggest`. Do not use tool suggestion if the needed tool is already available, found through `tool_search`, or callable after discovery.
2. Match the user's explicit request against the known plugin/connector list above. Only proceed when one listed plugin or connector exactly fits.
3. If we found both connectors and plugins to suggest, use plugins first, only use connectors if the corresponding plugin is installed but the connector is not.
4. If one tool clearly fits, call `tool_suggest` with:
- `tool_type`: `connector` or `plugin`
- `action_type`: `install`
- `tool_id`: exact id from the known plugin/connector list above
- `suggest_reason`: concise one-line user-facing reason this tool can help with the current request
5. After the suggestion flow completes:
- if the user finished the install flow, continue by searching again or using the newly available tool
- if the user did not finish, continue without that tool, and don't suggest that tool again unless the user explicitly asks for it.
IMPORTANT: DO NOT call this tool in parallel with other tools.


@@ -77,6 +77,7 @@ mod request_compression;
mod request_permissions;
#[cfg(not(target_os = "windows"))]
mod request_permissions_tool;
mod request_plugin_install;
mod request_user_input;
mod responses_api_proxy_headers;
mod resume;
@@ -98,7 +99,6 @@ mod stream_no_completed;
mod subagent_notifications;
mod tool_harness;
mod tool_parallelism;
mod tool_suggest;
mod tools;
mod truncation;
mod turn_state;


@@ -22,7 +22,7 @@ use core_test_support::test_codex::test_codex;
use serde_json::Value;
const TOOL_SEARCH_TOOL_NAME: &str = "tool_search";
const TOOL_SUGGEST_TOOL_NAME: &str = "tool_suggest";
const REQUEST_PLUGIN_INSTALL_TOOL_NAME: &str = "request_plugin_install";
const DISCOVERABLE_GMAIL_ID: &str = "connector_68df038e0ba48191908c8434991bbac2";
fn tool_names(body: &Value) -> Vec<String> {
@@ -89,7 +89,8 @@ fn configure_apps_without_search_tool(config: &mut Config, apps_base_url: &str)
}
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn tool_suggest_is_available_without_search_tool_after_discovery_attempts() -> Result<()> {
async fn request_plugin_install_is_available_without_search_tool_after_discovery_attempts()
-> Result<()> {
skip_if_no_network!(Ok(()));
let server = start_mock_server().await;
@@ -125,18 +126,23 @@ async fn tool_suggest_is_available_without_search_tool_after_discovery_attempts(
"tools list should not include {TOOL_SEARCH_TOOL_NAME}: {tools:?}"
);
assert!(
tools.iter().any(|name| name == TOOL_SUGGEST_TOOL_NAME),
"tools list should include {TOOL_SUGGEST_TOOL_NAME}: {tools:?}"
tools
.iter()
.any(|name| name == REQUEST_PLUGIN_INSTALL_TOOL_NAME),
"tools list should include {REQUEST_PLUGIN_INSTALL_TOOL_NAME}: {tools:?}"
);
let description =
function_tool_description(&body, TOOL_SUGGEST_TOOL_NAME).expect("description");
function_tool_description(&body, REQUEST_PLUGIN_INSTALL_TOOL_NAME).expect("description");
assert!(description.contains(
"Use this tool only to ask the user to install one known plugin or connector from the list below"
));
assert!(description.contains(
"`tool_search` is not available, or it has already been called and did not find or make the requested tool callable."
));
assert!(description.contains(
"Only use when the user explicitly asks to use that exact listed plugin or connector."
));
assert!(description.contains("IMPORTANT: DO NOT call this tool in parallel with other tools."));
assert!(!description.contains("tool_search fails to find a good match"));


@@ -27,7 +27,7 @@ schema and Responses API tool primitives that no longer need to live in
- collaboration and agent-job `ToolSpec` builders for spawn/send/wait/close,
`request_user_input`, and CSV fanout/reporting
- discoverable-tool models, client filtering, and `ToolSpec` builders for
`tool_search` and `tool_suggest`
`tool_search` and `request_plugin_install`
- `parse_tool_input_schema()`
- `parse_dynamic_tool()`
- `parse_mcp_tool()`


@@ -13,6 +13,7 @@ mod local_tool;
mod mcp_resource_tool;
mod mcp_tool;
mod plan_tool;
mod request_plugin_install;
mod request_user_input_tool;
mod responses_api;
mod tool_config;
@@ -21,7 +22,6 @@ mod tool_discovery;
mod tool_registry_plan;
mod tool_registry_plan_types;
mod tool_spec;
mod tool_suggest;
mod utility_tool;
mod view_image;
@@ -80,6 +80,15 @@ pub use mcp_resource_tool::create_read_mcp_resource_tool;
pub use mcp_tool::mcp_call_tool_result_output_schema;
pub use mcp_tool::parse_mcp_tool;
pub use plan_tool::create_update_plan_tool;
pub use request_plugin_install::REQUEST_PLUGIN_INSTALL_APPROVAL_KIND_VALUE;
pub use request_plugin_install::REQUEST_PLUGIN_INSTALL_PERSIST_ALWAYS_VALUE;
pub use request_plugin_install::REQUEST_PLUGIN_INSTALL_PERSIST_KEY;
pub use request_plugin_install::RequestPluginInstallArgs;
pub use request_plugin_install::RequestPluginInstallMeta;
pub use request_plugin_install::RequestPluginInstallResult;
pub use request_plugin_install::all_requested_connectors_picked_up;
pub use request_plugin_install::build_request_plugin_install_elicitation_request;
pub use request_plugin_install::verified_connector_install_completed;
pub use request_user_input_tool::REQUEST_USER_INPUT_TOOL_NAME;
pub use request_user_input_tool::create_request_user_input_tool;
pub use request_user_input_tool::normalize_request_user_input_args;
@@ -110,18 +119,18 @@ pub use tool_discovery::DiscoverablePluginInfo;
pub use tool_discovery::DiscoverableTool;
pub use tool_discovery::DiscoverableToolAction;
pub use tool_discovery::DiscoverableToolType;
pub use tool_discovery::REQUEST_PLUGIN_INSTALL_TOOL_NAME;
pub use tool_discovery::RequestPluginInstallEntry;
pub use tool_discovery::TOOL_SEARCH_DEFAULT_LIMIT;
pub use tool_discovery::TOOL_SEARCH_TOOL_NAME;
pub use tool_discovery::TOOL_SUGGEST_TOOL_NAME;
pub use tool_discovery::ToolSearchResultSource;
pub use tool_discovery::ToolSearchSource;
pub use tool_discovery::ToolSearchSourceInfo;
pub use tool_discovery::ToolSuggestEntry;
pub use tool_discovery::collect_request_plugin_install_entries;
pub use tool_discovery::collect_tool_search_source_infos;
pub use tool_discovery::collect_tool_suggest_entries;
pub use tool_discovery::create_request_plugin_install_tool;
pub use tool_discovery::create_tool_search_tool;
pub use tool_discovery::create_tool_suggest_tool;
pub use tool_discovery::filter_tool_suggest_discoverable_tools_for_client;
pub use tool_discovery::filter_request_plugin_install_discoverable_tools_for_client;
pub use tool_discovery::tool_search_result_source_to_loadable_tool_spec;
pub use tool_registry_plan::build_tool_registry_plan;
pub use tool_registry_plan_types::ToolHandlerKind;
@@ -140,15 +149,6 @@ pub use tool_spec::create_image_generation_tool;
pub use tool_spec::create_local_shell_tool;
pub use tool_spec::create_tools_json_for_responses_api;
pub use tool_spec::create_web_search_tool;
pub use tool_suggest::TOOL_SUGGEST_APPROVAL_KIND_VALUE;
pub use tool_suggest::TOOL_SUGGEST_PERSIST_ALWAYS_VALUE;
pub use tool_suggest::TOOL_SUGGEST_PERSIST_KEY;
pub use tool_suggest::ToolSuggestArgs;
pub use tool_suggest::ToolSuggestMeta;
pub use tool_suggest::ToolSuggestResult;
pub use tool_suggest::all_suggested_connectors_picked_up;
pub use tool_suggest::build_tool_suggestion_elicitation_request;
pub use tool_suggest::verified_connector_suggestion_completed;
pub use utility_tool::create_list_dir_tool;
pub use utility_tool::create_test_sync_tool;
pub use view_image::ViewImageToolOptions;


@@ -13,12 +13,12 @@ use crate::DiscoverableTool;
use crate::DiscoverableToolAction;
use crate::DiscoverableToolType;
pub const TOOL_SUGGEST_APPROVAL_KIND_VALUE: &str = "tool_suggestion";
pub const TOOL_SUGGEST_PERSIST_KEY: &str = "persist";
pub const TOOL_SUGGEST_PERSIST_ALWAYS_VALUE: &str = "always";
pub const REQUEST_PLUGIN_INSTALL_APPROVAL_KIND_VALUE: &str = "tool_suggestion";
pub const REQUEST_PLUGIN_INSTALL_PERSIST_KEY: &str = "persist";
pub const REQUEST_PLUGIN_INSTALL_PERSIST_ALWAYS_VALUE: &str = "always";
#[derive(Debug, Deserialize)]
pub struct ToolSuggestArgs {
pub struct RequestPluginInstallArgs {
pub tool_type: DiscoverableToolType,
pub action_type: DiscoverableToolAction,
pub tool_id: String,
@@ -26,7 +26,7 @@ pub struct ToolSuggestArgs {
}
#[derive(Debug, Serialize, PartialEq, Eq)]
pub struct ToolSuggestResult {
pub struct RequestPluginInstallResult {
pub completed: bool,
pub user_confirmed: bool,
pub tool_type: DiscoverableToolType,
@@ -37,7 +37,7 @@ pub struct ToolSuggestResult {
}
#[derive(Debug, Serialize, PartialEq, Eq)]
pub struct ToolSuggestMeta<'a> {
pub struct RequestPluginInstallMeta<'a> {
pub codex_approval_kind: &'static str,
pub persist: &'static str,
pub tool_type: DiscoverableToolType,
@@ -49,11 +49,11 @@ pub struct ToolSuggestMeta<'a> {
pub install_url: Option<&'a str>,
}
pub fn build_tool_suggestion_elicitation_request(
pub fn build_request_plugin_install_elicitation_request(
server_name: &str,
thread_id: String,
turn_id: String,
args: &ToolSuggestArgs,
args: &RequestPluginInstallArgs,
suggest_reason: &str,
tool: &DiscoverableTool,
) -> McpServerElicitationRequestParams {
@@ -66,7 +66,7 @@ pub fn build_tool_suggestion_elicitation_request(
turn_id: Some(turn_id),
server_name: server_name.to_string(),
request: McpServerElicitationRequest::Form {
meta: Some(json!(build_tool_suggestion_meta(
meta: Some(json!(build_request_plugin_install_meta(
args.tool_type,
args.action_type,
suggest_reason,
@@ -85,16 +85,16 @@ pub fn build_tool_suggestion_elicitation_request(
}
}
pub fn all_suggested_connectors_picked_up(
pub fn all_requested_connectors_picked_up(
expected_connector_ids: &[String],
accessible_connectors: &[AppInfo],
) -> bool {
expected_connector_ids.iter().all(|connector_id| {
verified_connector_suggestion_completed(connector_id, accessible_connectors)
verified_connector_install_completed(connector_id, accessible_connectors)
})
}
pub fn verified_connector_suggestion_completed(
pub fn verified_connector_install_completed(
tool_id: &str,
accessible_connectors: &[AppInfo],
) -> bool {
@@ -104,17 +104,17 @@ pub fn verified_connector_suggestion_completed(
.is_some_and(|connector| connector.is_accessible)
}
fn build_tool_suggestion_meta<'a>(
fn build_request_plugin_install_meta<'a>(
tool_type: DiscoverableToolType,
action_type: DiscoverableToolAction,
suggest_reason: &'a str,
tool_id: &'a str,
tool_name: &'a str,
install_url: Option<&'a str>,
) -> ToolSuggestMeta<'a> {
ToolSuggestMeta {
codex_approval_kind: TOOL_SUGGEST_APPROVAL_KIND_VALUE,
persist: TOOL_SUGGEST_PERSIST_ALWAYS_VALUE,
) -> RequestPluginInstallMeta<'a> {
RequestPluginInstallMeta {
codex_approval_kind: REQUEST_PLUGIN_INSTALL_APPROVAL_KIND_VALUE,
persist: REQUEST_PLUGIN_INSTALL_PERSIST_ALWAYS_VALUE,
tool_type,
suggest_type: action_type,
suggest_reason,
@@ -125,5 +125,5 @@ fn build_tool_suggestion_meta<'a>(
}
#[cfg(test)]
#[path = "tool_suggest_tests.rs"]
#[path = "request_plugin_install_tests.rs"]
mod tests;
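The two renamed completion checks above reduce to a lookup plus an accessibility test. A minimal sketch, assuming a pared-down `AppInfo` with only the two fields these helpers read (the real type carries many more):

```rust
// Simplified stand-in for AppInfo: only `id` and `is_accessible` matter here.
struct AppInfo {
    id: String,
    is_accessible: bool,
}

// A connector counts as installed once it appears in the accessible list
// and is marked accessible.
fn verified_connector_install_completed(tool_id: &str, accessible_connectors: &[AppInfo]) -> bool {
    accessible_connectors
        .iter()
        .find(|connector| connector.id == tool_id)
        .is_some_and(|connector| connector.is_accessible)
}

// Every requested connector must individually pass the check above.
fn all_requested_connectors_picked_up(
    expected_connector_ids: &[String],
    accessible_connectors: &[AppInfo],
) -> bool {
    expected_connector_ids
        .iter()
        .all(|id| verified_connector_install_completed(id, accessible_connectors))
}

fn main() {
    let connectors = vec![AppInfo {
        id: "calendar".to_string(),
        is_accessible: true,
    }];
    assert!(verified_connector_install_completed("calendar", &connectors));
    assert!(!verified_connector_install_completed("gmail", &connectors));
    assert!(!all_requested_connectors_picked_up(
        &["calendar".to_string(), "gmail".to_string()],
        &connectors,
    ));
    println!("ok");
}
```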

@@ -4,8 +4,8 @@ use pretty_assertions::assert_eq;
use serde_json::json;
#[test]
fn build_tool_suggestion_elicitation_request_uses_expected_shape() {
let args = ToolSuggestArgs {
fn build_request_plugin_install_elicitation_request_uses_expected_shape() {
let args = RequestPluginInstallArgs {
tool_type: DiscoverableToolType::Connector,
action_type: DiscoverableToolAction::Install,
tool_id: "connector_2128aebfecb84f64a069897515042a44".to_string(),
@@ -30,7 +30,7 @@ fn build_tool_suggestion_elicitation_request_uses_expected_shape() {
plugin_display_names: Vec::new(),
}));
let request = build_tool_suggestion_elicitation_request(
let request = build_request_plugin_install_elicitation_request(
"codex-apps",
"thread-1".to_string(),
"turn-1".to_string(),
@@ -46,9 +46,9 @@ fn build_tool_suggestion_elicitation_request_uses_expected_shape() {
turn_id: Some("turn-1".to_string()),
server_name: "codex-apps".to_string(),
request: McpServerElicitationRequest::Form {
meta: Some(json!(ToolSuggestMeta {
codex_approval_kind: TOOL_SUGGEST_APPROVAL_KIND_VALUE,
persist: TOOL_SUGGEST_PERSIST_ALWAYS_VALUE,
meta: Some(json!(RequestPluginInstallMeta {
codex_approval_kind: REQUEST_PLUGIN_INSTALL_APPROVAL_KIND_VALUE,
persist: REQUEST_PLUGIN_INSTALL_PERSIST_ALWAYS_VALUE,
tool_type: DiscoverableToolType::Connector,
suggest_type: DiscoverableToolAction::Install,
suggest_reason: "Plan and reference events from your calendar",
@@ -71,8 +71,8 @@ fn build_tool_suggestion_elicitation_request_uses_expected_shape() {
}
#[test]
fn build_tool_suggestion_elicitation_request_for_plugin_omits_install_url() {
let args = ToolSuggestArgs {
fn build_request_plugin_install_elicitation_request_for_plugin_omits_install_url() {
let args = RequestPluginInstallArgs {
tool_type: DiscoverableToolType::Plugin,
action_type: DiscoverableToolAction::Install,
tool_id: "sample@openai-curated".to_string(),
@@ -87,7 +87,7 @@ fn build_tool_suggestion_elicitation_request_for_plugin_omits_install_url() {
app_connector_ids: vec!["connector_calendar".to_string()],
}));
let request = build_tool_suggestion_elicitation_request(
let request = build_request_plugin_install_elicitation_request(
"codex-apps",
"thread-1".to_string(),
"turn-1".to_string(),
@@ -103,9 +103,9 @@ fn build_tool_suggestion_elicitation_request_for_plugin_omits_install_url() {
turn_id: Some("turn-1".to_string()),
server_name: "codex-apps".to_string(),
request: McpServerElicitationRequest::Form {
meta: Some(json!(ToolSuggestMeta {
codex_approval_kind: TOOL_SUGGEST_APPROVAL_KIND_VALUE,
persist: TOOL_SUGGEST_PERSIST_ALWAYS_VALUE,
meta: Some(json!(RequestPluginInstallMeta {
codex_approval_kind: REQUEST_PLUGIN_INSTALL_APPROVAL_KIND_VALUE,
persist: REQUEST_PLUGIN_INSTALL_PERSIST_ALWAYS_VALUE,
tool_type: DiscoverableToolType::Plugin,
suggest_type: DiscoverableToolAction::Install,
suggest_reason: "Use the sample plugin's skills and MCP server",
@@ -126,8 +126,8 @@ fn build_tool_suggestion_elicitation_request_for_plugin_omits_install_url() {
}
#[test]
fn build_tool_suggestion_meta_uses_expected_shape() {
let meta = build_tool_suggestion_meta(
fn build_request_plugin_install_meta_uses_expected_shape() {
let meta = build_request_plugin_install_meta(
DiscoverableToolType::Connector,
DiscoverableToolAction::Install,
"Find and reference emails from your inbox",
@@ -138,9 +138,9 @@ fn build_tool_suggestion_meta_uses_expected_shape() {
assert_eq!(
meta,
ToolSuggestMeta {
codex_approval_kind: TOOL_SUGGEST_APPROVAL_KIND_VALUE,
persist: TOOL_SUGGEST_PERSIST_ALWAYS_VALUE,
RequestPluginInstallMeta {
codex_approval_kind: REQUEST_PLUGIN_INSTALL_APPROVAL_KIND_VALUE,
persist: REQUEST_PLUGIN_INSTALL_PERSIST_ALWAYS_VALUE,
tool_type: DiscoverableToolType::Connector,
suggest_type: DiscoverableToolAction::Install,
suggest_reason: "Find and reference emails from your inbox",
@@ -154,7 +154,7 @@ fn build_tool_suggestion_meta_uses_expected_shape() {
}
#[test]
fn verified_connector_suggestion_completed_requires_accessible_connector() {
fn verified_connector_install_completed_requires_accessible_connector() {
let accessible_connectors = vec![AppInfo {
id: "calendar".to_string(),
name: "Google Calendar".to_string(),
@@ -171,18 +171,18 @@ fn verified_connector_suggestion_completed_requires_accessible_connector() {
plugin_display_names: Vec::new(),
}];
assert!(verified_connector_suggestion_completed(
assert!(verified_connector_install_completed(
"calendar",
&accessible_connectors,
));
assert!(!verified_connector_suggestion_completed(
assert!(!verified_connector_install_completed(
"gmail",
&accessible_connectors,
));
}
#[test]
fn all_suggested_connectors_picked_up_requires_every_expected_connector() {
fn all_requested_connectors_picked_up_requires_every_expected_connector() {
let accessible_connectors = vec![AppInfo {
id: "calendar".to_string(),
name: "Google Calendar".to_string(),
@@ -199,11 +199,11 @@ fn all_suggested_connectors_picked_up_requires_every_expected_connector() {
plugin_display_names: Vec::new(),
}];
assert!(all_suggested_connectors_picked_up(
assert!(all_requested_connectors_picked_up(
&["calendar".to_string()],
&accessible_connectors,
));
assert!(!all_suggested_connectors_picked_up(
assert!(!all_requested_connectors_picked_up(
&["calendar".to_string(), "gmail".to_string()],
&accessible_connectors,
));

@@ -15,7 +15,7 @@ use std::collections::BTreeMap;
const TUI_CLIENT_NAME: &str = "codex-tui";
pub const TOOL_SEARCH_TOOL_NAME: &str = "tool_search";
pub const TOOL_SEARCH_DEFAULT_LIMIT: usize = 8;
pub const TOOL_SUGGEST_TOOL_NAME: &str = "tool_suggest";
pub const REQUEST_PLUGIN_INSTALL_TOOL_NAME: &str = "request_plugin_install";
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct ToolSearchSourceInfo {
@@ -111,7 +111,7 @@ impl From<DiscoverablePluginInfo> for DiscoverableTool {
}
}
pub fn filter_tool_suggest_discoverable_tools_for_client(
pub fn filter_request_plugin_install_discoverable_tools_for_client(
discoverable_tools: Vec<DiscoverableTool>,
app_server_client_name: Option<&str>,
) -> Vec<DiscoverableTool> {
@@ -136,7 +136,7 @@ pub struct DiscoverablePluginInfo {
}
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct ToolSuggestEntry {
pub struct RequestPluginInstallEntry {
pub id: String,
pub name: String,
pub description: Option<String>,
@@ -271,7 +271,9 @@ pub fn collect_tool_search_source_infos<'a>(
.collect()
}
pub fn create_tool_suggest_tool(discoverable_tools: &[ToolSuggestEntry]) -> ToolSpec {
pub fn create_request_plugin_install_tool(
discoverable_tools: &[RequestPluginInstallEntry],
) -> ToolSpec {
let properties = BTreeMap::from([
(
"tool_type".to_string(),
@@ -291,7 +293,7 @@ pub fn create_tool_suggest_tool(discoverable_tools: &[ToolSuggestEntry]) -> Tool
(
"suggest_reason".to_string(),
JsonSchema::string(Some(
"Concise one-line user-facing reason why this tool can help with the current request."
"Concise one-line user-facing reason why this plugin or connector can help with the current request."
.to_string(),
)),
),
@@ -299,11 +301,11 @@ pub fn create_tool_suggest_tool(discoverable_tools: &[ToolSuggestEntry]) -> Tool
let discoverable_tools = format_discoverable_tools(discoverable_tools);
let description = format!(
"# Tool suggestion discovery\n\nUse this tool only to ask the user to install one known plugin or connector from the list below. The list contains known candidates that are not currently installed.\n\nUse this ONLY when all of the following are true:\n- The user explicitly wants a specific plugin or connector that is not already available in the current context or active `tools` list.\n- `{TOOL_SEARCH_TOOL_NAME}` is not available, or it has already been called and did not find or make the requested tool callable.\n- The tool is one of the known installable plugins or connectors listed below. Only ask to install tools from this list.\n\nDo not use tool suggestion for adjacent capabilities, broad recommendations, or tools that merely seem useful. The user's intent must clearly match one listed tool.\n\nKnown plugins/connectors available to install:\n{discoverable_tools}\n\nWorkflow:\n\n1. Check the current context and active `tools` list first. If `{TOOL_SEARCH_TOOL_NAME}` is available, call `{TOOL_SEARCH_TOOL_NAME}` before calling `{TOOL_SUGGEST_TOOL_NAME}`. Do not use tool suggestion if the needed tool is already available, found through `{TOOL_SEARCH_TOOL_NAME}`, or callable after discovery.\n2. Match the user's explicit request against the known plugin/connector list above. Only proceed when one listed plugin or connector exactly fits.\n3. If we found both connectors and plugins to suggest, use plugins first, only use connectors if the corresponding plugin is installed but the connector is not.\n4. If one tool clearly fits, call `{TOOL_SUGGEST_TOOL_NAME}` with:\n - `tool_type`: `connector` or `plugin`\n - `action_type`: `install`\n - `tool_id`: exact id from the known plugin/connector list above\n - `suggest_reason`: concise one-line user-facing reason this tool can help with the current request\n5. 
After the suggestion flow completes:\n - if the user finished the install flow, continue by searching again or using the newly available tool\n - if the user did not finish, continue without that tool, and don't suggest that tool again unless the user explicitly asks for it.\n\nIMPORTANT: DO NOT call this tool in parallel with other tools."
"# Request plugin/connector install\n\nUse this tool only to ask the user to install one known plugin or connector from the list below. The list contains known candidates that are not currently installed.\n\nUse this ONLY when all of the following are true:\n- The user explicitly asks to use a specific plugin or connector that is not already available in the current context or active `tools` list.\n- `{TOOL_SEARCH_TOOL_NAME}` is not available, or it has already been called and did not find or make the requested tool callable.\n- The plugin or connector is one of the known installable plugins or connectors listed below. Only ask to install plugins or connectors from this list.\n\nDo not use this tool for adjacent capabilities, broad recommendations, or tools that merely seem useful. Only use when the user explicitly asks to use that exact listed plugin or connector.\n\nKnown plugins/connectors available to install:\n{discoverable_tools}\n\nWorkflow:\n\n1. Check the current context and active `tools` list first. If current active tools aren't relevant and `{TOOL_SEARCH_TOOL_NAME}` is available, only call this tool after `{TOOL_SEARCH_TOOL_NAME}` has already been tried and found no relevant tool.\n2. Match the user's explicit request against the known plugin/connector list above. Only proceed when one listed plugin or connector exactly fits.\n3. If we found both connectors and plugins to install, use plugins first, only use connectors if the corresponding plugin is installed but the connector is not.\n4. If one plugin or connector clearly fits, call `{REQUEST_PLUGIN_INSTALL_TOOL_NAME}` with:\n - `tool_type`: `connector` or `plugin`\n - `action_type`: `install`\n - `tool_id`: exact id from the known plugin/connector list above\n - `suggest_reason`: concise one-line user-facing reason this plugin or connector can help with the current request\n5. 
After the request flow completes:\n - if the user finished the install flow, continue by searching again or using the newly available plugin or connector\n - if the user did not finish, continue without that plugin or connector, and don't request it again unless the user explicitly asks for it.\n\nIMPORTANT: DO NOT call this tool in parallel with other tools."
);
ToolSpec::Function(ResponsesApiTool {
name: TOOL_SUGGEST_TOOL_NAME.to_string(),
name: REQUEST_PLUGIN_INSTALL_TOOL_NAME.to_string(),
description,
strict: false,
defer_loading: None,
@@ -321,13 +323,13 @@ pub fn create_tool_suggest_tool(discoverable_tools: &[ToolSuggestEntry]) -> Tool
})
}
pub fn collect_tool_suggest_entries(
pub fn collect_request_plugin_install_entries(
discoverable_tools: &[DiscoverableTool],
) -> Vec<ToolSuggestEntry> {
) -> Vec<RequestPluginInstallEntry> {
discoverable_tools
.iter()
.map(|tool| match tool {
DiscoverableTool::Connector(connector) => ToolSuggestEntry {
DiscoverableTool::Connector(connector) => RequestPluginInstallEntry {
id: connector.id.clone(),
name: connector.name.clone(),
description: connector.description.clone(),
@@ -336,7 +338,7 @@ pub fn collect_tool_suggest_entries(
mcp_server_names: Vec::new(),
app_connector_ids: Vec::new(),
},
DiscoverableTool::Plugin(plugin) => ToolSuggestEntry {
DiscoverableTool::Plugin(plugin) => RequestPluginInstallEntry {
id: plugin.id.clone(),
name: plugin.name.clone(),
description: plugin.description.clone(),
@@ -349,7 +351,7 @@ pub fn collect_tool_suggest_entries(
.collect()
}
fn format_discoverable_tools(discoverable_tools: &[ToolSuggestEntry]) -> String {
fn format_discoverable_tools(discoverable_tools: &[RequestPluginInstallEntry]) -> String {
let mut discoverable_tools = discoverable_tools.to_vec();
discoverable_tools.sort_by(|left, right| {
left.name
@@ -373,7 +375,7 @@ fn format_discoverable_tools(discoverable_tools: &[ToolSuggestEntry]) -> String
.join("\n")
}
fn tool_description_or_fallback(tool: &ToolSuggestEntry) -> String {
fn tool_description_or_fallback(tool: &RequestPluginInstallEntry) -> String {
if let Some(description) = tool
.description
.as_deref()
@@ -389,7 +391,7 @@ fn tool_description_or_fallback(tool: &ToolSuggestEntry) -> String {
}
}
fn plugin_summary(tool: &ToolSuggestEntry) -> String {
fn plugin_summary(tool: &RequestPluginInstallEntry) -> String {
let mut details = Vec::new();
if tool.has_skills {
details.push("skills".to_string());

@@ -49,36 +49,36 @@ fn create_tool_search_tool_deduplicates_and_renders_enabled_sources() {
}
#[test]
fn create_tool_suggest_tool_uses_plugin_summary_fallback() {
fn create_request_plugin_install_tool_uses_plugin_summary_fallback() {
let expected_description = concat!(
"# Tool suggestion discovery\n\n",
"# Request plugin/connector install\n\n",
"Use this tool only to ask the user to install one known plugin or connector from the list below. The list contains known candidates that are not currently installed.\n\n",
"Use this ONLY when all of the following are true:\n",
"- The user explicitly wants a specific plugin or connector that is not already available in the current context or active `tools` list.\n",
"- The user explicitly asks to use a specific plugin or connector that is not already available in the current context or active `tools` list.\n",
"- `tool_search` is not available, or it has already been called and did not find or make the requested tool callable.\n",
"- The tool is one of the known installable plugins or connectors listed below. Only ask to install tools from this list.\n\n",
"Do not use tool suggestion for adjacent capabilities, broad recommendations, or tools that merely seem useful. The user's intent must clearly match one listed tool.\n\n",
"- The plugin or connector is one of the known installable plugins or connectors listed below. Only ask to install plugins or connectors from this list.\n\n",
"Do not use this tool for adjacent capabilities, broad recommendations, or tools that merely seem useful. Only use when the user explicitly asks to use that exact listed plugin or connector.\n\n",
"Known plugins/connectors available to install:\n",
"- GitHub (id: `github`, type: plugin, action: install): skills; MCP servers: github-mcp; app connectors: github-app\n",
"- Slack (id: `slack@openai-curated`, type: connector, action: install): No description provided.\n\n",
"Workflow:\n\n",
"1. Check the current context and active `tools` list first. If `tool_search` is available, call `tool_search` before calling `tool_suggest`. Do not use tool suggestion if the needed tool is already available, found through `tool_search`, or callable after discovery.\n",
"1. Check the current context and active `tools` list first. If current active tools aren't relevant and `tool_search` is available, only call this tool after `tool_search` has already been tried and found no relevant tool.\n",
"2. Match the user's explicit request against the known plugin/connector list above. Only proceed when one listed plugin or connector exactly fits.\n",
"3. If we found both connectors and plugins to suggest, use plugins first, only use connectors if the corresponding plugin is installed but the connector is not.\n",
"4. If one tool clearly fits, call `tool_suggest` with:\n",
"3. If we found both connectors and plugins to install, use plugins first, only use connectors if the corresponding plugin is installed but the connector is not.\n",
"4. If one plugin or connector clearly fits, call `request_plugin_install` with:\n",
" - `tool_type`: `connector` or `plugin`\n",
" - `action_type`: `install`\n",
" - `tool_id`: exact id from the known plugin/connector list above\n",
" - `suggest_reason`: concise one-line user-facing reason this tool can help with the current request\n",
"5. After the suggestion flow completes:\n",
" - if the user finished the install flow, continue by searching again or using the newly available tool\n",
" - if the user did not finish, continue without that tool, and don't suggest that tool again unless the user explicitly asks for it.\n\n",
" - `suggest_reason`: concise one-line user-facing reason this plugin or connector can help with the current request\n",
"5. After the request flow completes:\n",
" - if the user finished the install flow, continue by searching again or using the newly available plugin or connector\n",
" - if the user did not finish, continue without that plugin or connector, and don't request it again unless the user explicitly asks for it.\n\n",
"IMPORTANT: DO NOT call this tool in parallel with other tools.",
);
assert_eq!(
create_tool_suggest_tool(&[
ToolSuggestEntry {
create_request_plugin_install_tool(&[
RequestPluginInstallEntry {
id: "slack@openai-curated".to_string(),
name: "Slack".to_string(),
description: None,
@@ -87,7 +87,7 @@ fn create_tool_suggest_tool_uses_plugin_summary_fallback() {
mcp_server_names: Vec::new(),
app_connector_ids: Vec::new(),
},
ToolSuggestEntry {
RequestPluginInstallEntry {
id: "github".to_string(),
name: "GitHub".to_string(),
description: None,
@@ -98,7 +98,7 @@ fn create_tool_suggest_tool_uses_plugin_summary_fallback() {
},
]),
ToolSpec::Function(ResponsesApiTool {
name: "tool_suggest".to_string(),
name: "request_plugin_install".to_string(),
description: expected_description.to_string(),
strict: false,
defer_loading: None,
@@ -113,7 +113,7 @@ fn create_tool_suggest_tool_uses_plugin_summary_fallback() {
(
"suggest_reason".to_string(),
JsonSchema::string(Some(
"Concise one-line user-facing reason why this tool can help with the current request."
"Concise one-line user-facing reason why this plugin or connector can help with the current request."
.to_string(),
),),
),
@@ -157,7 +157,7 @@ fn discoverable_tool_enums_use_expected_wire_names() {
}
#[test]
fn filter_tool_suggest_discoverable_tools_for_codex_tui_omits_plugins() {
fn filter_request_plugin_install_discoverable_tools_for_codex_tui_omits_plugins() {
let discoverable_tools = vec![
DiscoverableTool::Connector(Box::new(AppInfo {
id: "connector_google_calendar".to_string(),
@@ -185,7 +185,10 @@ fn filter_tool_suggest_discoverable_tools_for_codex_tui_omits_plugins() {
];
assert_eq!(
filter_tool_suggest_discoverable_tools_for_client(discoverable_tools, Some("codex-tui"),),
filter_request_plugin_install_discoverable_tools_for_client(
discoverable_tools,
Some("codex-tui"),
),
vec![DiscoverableTool::Connector(Box::new(AppInfo {
id: "connector_google_calendar".to_string(),
name: "Google Calendar".to_string(),

@@ -1,4 +1,5 @@
use crate::CommandToolOptions;
use crate::REQUEST_PLUGIN_INSTALL_TOOL_NAME;
use crate::REQUEST_USER_INPUT_TOOL_NAME;
use crate::ResponsesApiNamespace;
use crate::ResponsesApiNamespaceTool;
@@ -6,7 +7,6 @@ use crate::ShellToolOptions;
use crate::SpawnAgentToolOptions;
use crate::TOOL_SEARCH_DEFAULT_LIMIT;
use crate::TOOL_SEARCH_TOOL_NAME;
use crate::TOOL_SUGGEST_TOOL_NAME;
use crate::ToolHandlerKind;
use crate::ToolName;
use crate::ToolRegistryPlan;
@@ -19,8 +19,8 @@ use crate::ViewImageToolOptions;
use crate::WebSearchToolOptions;
use crate::coalesce_loadable_tool_specs;
use crate::collect_code_mode_exec_prompt_tool_definitions;
use crate::collect_request_plugin_install_entries;
use crate::collect_tool_search_source_infos;
use crate::collect_tool_suggest_entries;
use crate::create_apply_patch_freeform_tool;
use crate::create_apply_patch_json_tool;
use crate::create_close_agent_tool_v1;
@@ -39,6 +39,7 @@ use crate::create_local_shell_tool;
use crate::create_read_mcp_resource_tool;
use crate::create_report_agent_job_result_tool;
use crate::create_request_permissions_tool;
use crate::create_request_plugin_install_tool;
use crate::create_request_user_input_tool;
use crate::create_resume_agent_tool;
use crate::create_send_input_tool_v1;
@@ -50,7 +51,6 @@ use crate::create_spawn_agent_tool_v2;
use crate::create_spawn_agents_on_csv_tool;
use crate::create_test_sync_tool;
use crate::create_tool_search_tool;
use crate::create_tool_suggest_tool;
use crate::create_update_goal_tool;
use crate::create_update_plan_tool;
use crate::create_view_image_tool;
@@ -312,11 +312,16 @@ pub fn build_tool_registry_plan(
params.discoverable_tools.filter(|tools| !tools.is_empty())
{
plan.push_spec(
create_tool_suggest_tool(&collect_tool_suggest_entries(discoverable_tools)),
create_request_plugin_install_tool(&collect_request_plugin_install_entries(
discoverable_tools,
)),
/*supports_parallel_tool_calls*/ true,
/*code_mode_enabled*/ false,
);
plan.register_handler(TOOL_SUGGEST_TOOL_NAME, ToolHandlerKind::ToolSuggest);
plan.register_handler(
REQUEST_PLUGIN_INSTALL_TOOL_NAME,
ToolHandlerKind::RequestPluginInstall,
);
}
if config.has_environment

@@ -1692,7 +1692,7 @@ fn search_tool_keeps_plain_deferred_dynamic_tools_when_namespace_tools_are_disab
}
#[test]
fn tool_suggest_is_not_registered_without_feature_flag() {
fn request_plugin_install_is_not_registered_without_feature_flag() {
let model_info = search_capable_model_info();
let mut features = Features::with_defaults();
features.enable(Feature::ToolSearch);
@@ -1725,12 +1725,12 @@ fn tool_suggest_is_not_registered_without_feature_flag() {
assert!(
!tools
.iter()
.any(|tool| tool.name() == TOOL_SUGGEST_TOOL_NAME)
.any(|tool| tool.name() == REQUEST_PLUGIN_INSTALL_TOOL_NAME)
);
}
#[test]
fn tool_suggest_can_be_registered_without_search_tool() {
fn request_plugin_install_can_be_registered_without_search_tool() {
let model_info = ModelInfo {
supports_search_tool: false,
..search_capable_model_info()
@@ -1762,12 +1762,13 @@ fn tool_suggest_can_be_registered_without_search_tool() {
&[],
);
assert_contains_tool_names(&tools, &[TOOL_SUGGEST_TOOL_NAME]);
let tool_suggest = find_tool(&tools, TOOL_SUGGEST_TOOL_NAME);
assert!(tool_suggest.supports_parallel_tool_calls);
assert_contains_tool_names(&tools, &[REQUEST_PLUGIN_INSTALL_TOOL_NAME]);
let request_plugin_install = find_tool(&tools, REQUEST_PLUGIN_INSTALL_TOOL_NAME);
assert!(request_plugin_install.supports_parallel_tool_calls);
assert_lacks_tool_name(&tools, TOOL_SEARCH_TOOL_NAME);
let ToolSpec::Function(ResponsesApiTool { description, .. }) = &tool_suggest.spec else {
let ToolSpec::Function(ResponsesApiTool { description, .. }) = &request_plugin_install.spec
else {
panic!("expected function tool");
};
assert!(description.contains(
@@ -1779,7 +1780,7 @@ fn tool_suggest_can_be_registered_without_search_tool() {
}
#[test]
fn tool_suggest_description_lists_discoverable_tools() {
fn request_plugin_install_description_lists_discoverable_tools() {
let model_info = search_capable_model_info();
let mut features = Features::with_defaults();
features.enable(Feature::Apps);
@@ -1827,16 +1828,16 @@ fn tool_suggest_description_lists_discoverable_tools() {
&[],
);
assert!(handlers.contains(&ToolHandlerSpec {
name: ToolName::plain(TOOL_SUGGEST_TOOL_NAME),
kind: ToolHandlerKind::ToolSuggest,
name: ToolName::plain(REQUEST_PLUGIN_INSTALL_TOOL_NAME),
kind: ToolHandlerKind::RequestPluginInstall,
}));
let tool_suggest = find_tool(&tools, TOOL_SUGGEST_TOOL_NAME);
let request_plugin_install = find_tool(&tools, REQUEST_PLUGIN_INSTALL_TOOL_NAME);
let ToolSpec::Function(ResponsesApiTool {
description,
parameters,
..
}) = &tool_suggest.spec
}) = &request_plugin_install.spec
else {
panic!("expected function tool");
};
@@ -1855,30 +1856,27 @@ fn tool_suggest_description_lists_discoverable_tools() {
);
assert!(
description.contains(
"The user explicitly wants a specific plugin or connector that is not already available in the current context or active `tools` list."
"The user explicitly asks to use a specific plugin or connector that is not already available in the current context or active `tools` list."
)
);
assert!(description.contains(
"`tool_search` is not available, or it has already been called and did not find or make the requested tool callable."
));
assert!(description.contains(
"The tool is one of the known installable plugins or connectors listed below. Only ask to install tools from this list."
"The plugin or connector is one of the known installable plugins or connectors listed below. Only ask to install plugins or connectors from this list."
));
assert!(description.contains(
"Do not use tool suggestion for adjacent capabilities, broad recommendations, or tools that merely seem useful."
"Do not use this tool for adjacent capabilities, broad recommendations, or tools that merely seem useful."
));
assert!(description.contains("IMPORTANT: DO NOT call this tool in parallel with other tools."));
assert!(description.contains(
"Do not use tool suggestion if the needed tool is already available, found through `tool_search`, or callable after discovery."
));
assert!(description.contains(
"If `tool_search` is available, call `tool_search` before calling `tool_suggest`."
"If current active tools aren't relevant and `tool_search` is available, only call this tool after `tool_search` has already been tried and found no relevant tool."
));
assert!(!description.contains("targeted lookup"));
assert!(!description.contains("broad or speculative searches"));
assert!(description.contains("Only proceed when one listed plugin or connector exactly fits."));
assert!(description.contains(
"If we found both connectors and plugins to suggest, use plugins first, only use connectors if the corresponding plugin is installed but the connector is not."
"If we found both connectors and plugins to install, use plugins first, only use connectors if the corresponding plugin is installed but the connector is not."
));
assert!(!description.contains("{{discoverable_tools}}"));
assert!(!description.contains("tool_search fails to find a good match"));

@@ -35,7 +35,7 @@ pub enum ToolHandlerKind {
SpawnAgentV2,
TestSync,
ToolSearch,
ToolSuggest,
RequestPluginInstall,
UnifiedExec,
ViewImage,
WaitAgentV1,

@@ -201,16 +201,48 @@ where
tracing::warn!("failed to read initial cursor position; defaulting to origin: {err}");
Position { x: 0, y: 0 }
});
Ok(Self {
Ok(Self::with_screen_size_and_cursor_position(
backend,
screen_size,
cursor_pos,
))
}
/// Creates a new [`Terminal`] from a caller-provided initial cursor position.
///
/// Startup code uses this when cursor probing has already happened outside the backend, for
/// example through a bounded terminal probe. Supplying a stale or synthetic position changes
/// the inline viewport anchor, so callers should only use this after they have chosen the same
/// fallback they want the first render to honor.
pub fn with_options_and_cursor_position(backend: B, cursor_pos: Position) -> io::Result<Self> {
let screen_size = backend.size()?;
Ok(Self::with_screen_size_and_cursor_position(
backend,
screen_size,
cursor_pos,
))
}
fn with_screen_size_and_cursor_position(
backend: B,
screen_size: Size,
cursor_pos: Position,
) -> Self {
Self {
backend,
buffers: [Buffer::empty(Rect::ZERO), Buffer::empty(Rect::ZERO)],
current: 0,
hidden_cursor: false,
viewport_area: Rect::new(0, cursor_pos.y, 0, 0),
viewport_area: Rect::new(
/*x*/ 0,
cursor_pos.y,
/*width*/ 0,
/*height*/ 0,
),
last_known_screen_size: screen_size,
last_known_cursor_pos: cursor_pos,
visible_history_rows: 0,
})
}
}
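The anchoring behavior the new doc comment warns about can be shown with a tiny sketch. This is not the crate's `Rect` type, just an illustration of why a stale or synthetic cursor position shifts where the inline viewport starts:

```rust
// Illustration only: the inline viewport begins as a zero-sized rect pinned
// to the probed cursor row, so whatever fallback position the caller supplies
// decides where the first render anchors.
#[derive(Debug, PartialEq)]
struct Rect {
    x: u16,
    y: u16,
    width: u16,
    height: u16,
}

fn initial_viewport(cursor_row: u16) -> Rect {
    Rect { x: 0, y: cursor_row, width: 0, height: 0 }
}

fn main() {
    assert_eq!(initial_viewport(12), Rect { x: 0, y: 12, width: 0, height: 0 });
    // A synthetic fallback of row 0 anchors the viewport at the top instead.
    assert_eq!(initial_viewport(0).y, 0);
    println!("ok");
}
```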
/// Get a Frame object which provides a consistent view into the terminal state for rendering.

@@ -166,6 +166,7 @@ mod status_indicator_widget;
mod streaming;
mod style;
mod terminal_palette;
mod terminal_probe;
mod terminal_title;
mod text_formatting;
mod theme_picker;

@@ -99,12 +99,6 @@ mod imp {
}
self.value
}
fn refresh_with(&mut self, mut init: impl FnMut() -> Option<T>) -> Option<T> {
self.value = init();
self.attempted = true;
self.value
}
}
fn default_colors_cache() -> &'static Mutex<Cache<DefaultColors>> {
@@ -115,7 +109,7 @@ mod imp {
pub(super) fn default_colors() -> Option<DefaultColors> {
let cache = default_colors_cache();
let mut cache = cache.lock().ok()?;
cache.get_or_init_with(|| query_default_colors().unwrap_or_default())
cache.get_or_init_with(query_default_colors)
}
pub(super) fn requery_default_colors() {
@@ -124,14 +118,36 @@ mod imp {
if cache.attempted && cache.value.is_none() {
return;
}
cache.refresh_with(|| query_default_colors().unwrap_or_default());
// Focus events arrive after crossterm's event stream is active. Requery through
// crossterm here so unrelated input stays in crossterm's skipped-event queue instead
// of being consumed by the bounded startup probe's direct tty reads.
let fg = query_foreground_color()
.ok()
.flatten()
.and_then(color_to_tuple);
let bg = query_background_color()
.ok()
.flatten()
.and_then(color_to_tuple);
cache.value = fg.zip(bg).map(|(fg, bg)| DefaultColors { fg, bg });
cache.attempted = true;
}
}
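The cache semantics relied on above (a failed probe is remembered, and a requery refuses to retry a known miss) can be sketched with a minimal once-attempted cache. This mirrors the shape of the code shown, not its exact implementation:

```rust
// Once-attempted cache: the first lookup runs the probe and records the
// outcome; later lookups return the remembered result without re-probing.
struct Cache<T> {
    attempted: bool,
    value: Option<T>,
}

impl<T: Copy> Cache<T> {
    fn new() -> Self {
        Cache { attempted: false, value: None }
    }

    fn get_or_init_with(&mut self, init: impl FnOnce() -> Option<T>) -> Option<T> {
        if !self.attempted {
            self.value = init();
            self.attempted = true;
        }
        self.value
    }
}

fn main() {
    let mut cache = Cache::new();
    // First lookup runs the probe; here it fails.
    assert_eq!(cache.get_or_init_with(|| None::<u32>), None);
    // Second lookup remembers the miss and never invokes the closure,
    // matching the `attempted && value.is_none()` early return above.
    assert_eq!(cache.get_or_init_with(|| Some(7)), None);
    println!("ok");
}
```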
fn query_default_colors() -> std::io::Result<Option<DefaultColors>> {
let fg = query_foreground_color()?.and_then(color_to_tuple);
let bg = query_background_color()?.and_then(color_to_tuple);
Ok(fg.zip(bg).map(|(fg, bg)| DefaultColors { fg, bg }))
/// Queries terminal default colors through the bounded startup probe path.
///
/// The palette cache treats `None` as an attempted-but-unavailable result, so this function
/// collapses I/O errors and missing responses into the same fallback path used for terminals
/// that simply do not support OSC 10/11 queries.
fn query_default_colors() -> Option<DefaultColors> {
crate::terminal_probe::default_colors(crate::terminal_probe::DEFAULT_TIMEOUT)
.ok()
.flatten()
.map(|colors| DefaultColors {
fg: colors.fg,
bg: colors.bg,
})
}
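For context on what the probe hands back: terminals answer OSC 10/11 with a payload conventionally shaped like `rgb:ffff/0000/8080`, using 16-bit components. A hypothetical parser for that wire format (not the code under change, which delegates parsing to the probe module) might look like:

```rust
// Hypothetical OSC 10/11 reply parser for illustration. 4-hex-digit components
// are truncated to their high byte; 2-digit components are used as-is.
fn parse_osc_color(payload: &str) -> Option<(u8, u8, u8)> {
    let mut parts = payload.strip_prefix("rgb:")?.split('/');
    let mut next = || -> Option<u8> {
        let part = parts.next()?;
        let value = u32::from_str_radix(part, 16).ok()?;
        match part.len() {
            2 => Some(value as u8),
            4 => Some((value >> 8) as u8),
            _ => None,
        }
    };
    let r = next()?;
    let g = next()?;
    let b = next()?;
    Some((r, g, b))
}

fn main() {
    assert_eq!(parse_osc_color("rgb:ffff/0000/8080"), Some((255, 0, 128)));
    assert_eq!(parse_osc_color("not-a-color"), None);
    println!("ok");
}
```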
fn color_to_tuple(color: CrosstermColor) -> Option<(u8, u8, u8)> {

@@ -0,0 +1,563 @@
//! Short, best-effort terminal response probes for TUI startup.
//!
//! Crossterm's public helpers wait up to two seconds for terminal responses. That is too long for
//! TUI startup, where unsupported terminals should simply fall back to conservative defaults.
//! This module sends the same kinds of optional terminal queries with a caller-provided deadline,
//! prefers duplicated stdio handles, falls back to the controlling terminal path when stdio is
//! unavailable, and reports `None` when a response is unavailable.
//!
//! The probes run before the crossterm event stream is created, so they do not share crossterm's
//! internal skipped-event queue. Bytes read while looking for probe responses are consumed from the
//! terminal; keeping the timeout short is part of the contract that makes this acceptable for
//! startup. A future input-preservation layer would need to replay unrelated bytes through the same
//! parser that normal TUI input uses.
#[cfg(unix)]
#[cfg_attr(test, allow(dead_code))]
mod imp {
use std::fs::File;
use std::fs::OpenOptions;
use std::io;
use std::io::Write;
use std::os::fd::AsRawFd;
use std::os::fd::FromRawFd;
use std::time::Duration;
use std::time::Instant;
use crossterm::event::KeyboardEnhancementFlags;
use ratatui::layout::Position;
/// Default wall-clock budget for each startup probe group.
pub(crate) const DEFAULT_TIMEOUT: Duration = Duration::from_millis(100);
/// Default terminal foreground and background colors reported by OSC 10 and OSC 11.
#[derive(Debug, Clone, Copy, Eq, PartialEq)]
pub(crate) struct DefaultColors {
/// Default foreground color as an 8-bit RGB tuple.
pub(crate) fg: (u8, u8, u8),
/// Default background color as an 8-bit RGB tuple.
pub(crate) bg: (u8, u8, u8),
}
/// Temporary terminal handle used while a startup probe owns terminal input.
///
/// The preferred path is duplicated stdin/stdout, because terminal replies are delivered to the
/// same input stream crossterm reads from. Some embedded or redirected environments expose a
/// controlling terminal without terminal stdio; in that case the handle falls back to
/// `/dev/tty`. Only the reader is switched to nonblocking mode, and its original file status
/// flags are restored when the handle is dropped.
struct Tty {
reader: File,
writer: File,
original_flags: libc::c_int,
}
impl Tty {
/// Opens an isolated reader and writer for startup probes.
///
/// The reader and writer must be separate file descriptions so switching the reader into
/// nonblocking mode does not also make writes fail with `WouldBlock` under terminal
/// backpressure. Falling back to `/dev/tty` keeps embedded or redirected environments
/// usable when they still expose a controlling terminal.
fn open() -> io::Result<Self> {
let stdio_reader = dup_file(libc::STDIN_FILENO);
let stdio_writer = dup_file(libc::STDOUT_FILENO);
match (stdio_reader, stdio_writer) {
(Ok(reader), Ok(writer)) => Self::new(reader, writer),
(reader, writer) => {
let stdio_err = match (reader.err(), writer.err()) {
(Some(reader_err), Some(writer_err)) => {
format!("reader: {reader_err}; writer: {writer_err}")
}
(Some(reader_err), None) => format!("reader: {reader_err}"),
(None, Some(writer_err)) => format!("writer: {writer_err}"),
(None, None) => "unknown stdio duplicate error".to_string(),
};
let reader =
OpenOptions::new()
.read(true)
.open("/dev/tty")
.map_err(|fallback_err| {
io::Error::new(
fallback_err.kind(),
format!(
"failed to duplicate stdio ({stdio_err}) or open /dev/tty reader ({fallback_err})"
),
)
})?;
let writer = OpenOptions::new().write(true).open("/dev/tty").map_err(
|fallback_err| {
io::Error::new(
fallback_err.kind(),
format!(
"failed to duplicate stdio ({stdio_err}) or open /dev/tty writer ({fallback_err})"
),
)
},
)?;
Self::new(reader, writer)
}
}
}
fn new(reader: File, writer: File) -> io::Result<Self> {
let fd = reader.as_raw_fd();
let original_flags = unsafe { libc::fcntl(fd, libc::F_GETFL) };
if original_flags == -1 {
return Err(io::Error::last_os_error());
}
if unsafe { libc::fcntl(fd, libc::F_SETFL, original_flags | libc::O_NONBLOCK) } == -1 {
return Err(io::Error::last_os_error());
}
Ok(Self {
reader,
writer,
original_flags,
})
}
fn write_all(&mut self, bytes: &[u8]) -> io::Result<()> {
self.writer.write_all(bytes)?;
self.writer.flush()
}
fn read_available(&mut self, buffer: &mut Vec<u8>) -> io::Result<()> {
let mut chunk = [0_u8; 256];
loop {
let count = unsafe {
libc::read(
self.reader.as_raw_fd(),
chunk.as_mut_ptr().cast::<libc::c_void>(),
chunk.len(),
)
};
if count > 0 {
buffer.extend_from_slice(&chunk[..count as usize]);
continue;
}
if count == 0 {
return Ok(());
}
let err = io::Error::last_os_error();
if matches!(
err.kind(),
io::ErrorKind::WouldBlock | io::ErrorKind::Interrupted
) {
return Ok(());
}
return Err(err);
}
}
fn poll_readable(&self, timeout: Duration) -> io::Result<bool> {
let mut fd = libc::pollfd {
fd: self.reader.as_raw_fd(),
events: libc::POLLIN,
revents: 0,
};
let deadline = Instant::now() + timeout;
loop {
let now = Instant::now();
if now >= deadline {
return Ok(false);
}
let timeout_ms = deadline
.saturating_duration_since(now)
.as_millis()
.min(libc::c_int::MAX as u128) as libc::c_int;
let result = unsafe {
libc::poll(&mut fd, /*nfds*/ 1, timeout_ms)
};
if result > 0 {
return Ok((fd.revents & libc::POLLIN) != 0);
}
if result == 0 {
return Ok(false);
}
let err = io::Error::last_os_error();
if err.kind() != io::ErrorKind::Interrupted {
return Err(err);
}
}
}
}
impl Drop for Tty {
fn drop(&mut self) {
let _ =
unsafe { libc::fcntl(self.reader.as_raw_fd(), libc::F_SETFL, self.original_flags) };
}
}
/// Duplicates a process stdio descriptor so probe cleanup owns only the duplicate.
fn dup_file(fd: libc::c_int) -> io::Result<File> {
let duplicated = unsafe { libc::dup(fd) };
if duplicated == -1 {
return Err(io::Error::last_os_error());
}
Ok(unsafe { File::from_raw_fd(duplicated) })
}
/// Queries the current cursor position and returns a zero-based Ratatui position.
///
/// A timeout or a non-CPR response is not fatal. Callers should treat `Ok(None)` as "terminal
/// did not answer this optional query" and choose a conservative fallback.
pub(crate) fn cursor_position(timeout: Duration) -> io::Result<Option<Position>> {
let mut tty = Tty::open()?;
tty.write_all(b"\x1B[6n")?;
let Some(response) = read_until(&mut tty, timeout, parse_cursor_position)? else {
return Ok(None);
};
Ok(Some(response))
}
/// Queries OSC 10 and OSC 11 default colors under one shared deadline.
///
/// Foreground and background are only useful as a pair for palette calculations, so a missing
/// response from either slot returns `Ok(None)`. Both queries are sent before reading so a
/// terminal that supports palette replies gets the full bounded window to return both values,
/// while unsupported terminals still pay one bounded wait instead of one wait per slot.
pub(crate) fn default_colors(timeout: Duration) -> io::Result<Option<DefaultColors>> {
let mut tty = Tty::open()?;
tty.write_all(b"\x1B]10;?\x1B\\\x1B]11;?\x1B\\")?;
let Some(colors) = read_until(&mut tty, timeout, parse_default_colors)? else {
return Ok(None);
};
Ok(Some(colors))
}
/// Checks whether the terminal reports support for keyboard enhancement flags.
///
/// The probe sends the kitty keyboard-status query followed by primary-device-attributes as a
/// fallback. A PDA response proves that the terminal answered, but it does not prove that keyboard
/// enhancement is unsupported until the bounded wait expires; a flags response that arrives later
/// within the same deadline must still win.
pub(crate) fn keyboard_enhancement_supported(timeout: Duration) -> io::Result<Option<bool>> {
let mut tty = Tty::open()?;
tty.write_all(b"\x1B[?u\x1B[c")?;
read_keyboard_enhancement_supported(&mut tty, timeout)
}
/// Reads available terminal bytes until `parse` recognizes a probe response or time expires.
///
/// The accumulated buffer may include unrelated terminal input. This helper intentionally does
/// not try to replay those bytes, so it must stay limited to short startup probes that run
/// before normal crossterm input polling begins.
fn read_until<T>(
tty: &mut Tty,
timeout: Duration,
mut parse: impl FnMut(&[u8]) -> Option<T>,
) -> io::Result<Option<T>> {
let deadline = Instant::now() + timeout;
let mut buffer = Vec::new();
loop {
tty.read_available(&mut buffer)?;
if let Some(value) = parse(&buffer) {
return Ok(Some(value));
}
let now = Instant::now();
if now >= deadline {
return Ok(None);
}
if !tty.poll_readable(deadline.saturating_duration_since(now))? {
return Ok(None);
}
}
}
/// Reads keyboard-enhancement responses while giving flags the full bounded window to arrive.
fn read_keyboard_enhancement_supported(
tty: &mut Tty,
timeout: Duration,
) -> io::Result<Option<bool>> {
let deadline = Instant::now() + timeout;
let mut buffer = Vec::new();
let mut saw_supported = false;
let mut saw_unsupported_fallback = false;
loop {
tty.read_available(&mut buffer)?;
match parse_keyboard_enhancement_support(&buffer) {
KeyboardProbeState::SupportedAndFallback => return Ok(Some(true)),
KeyboardProbeState::Supported => saw_supported = true,
KeyboardProbeState::UnsupportedFallback => saw_unsupported_fallback = true,
KeyboardProbeState::Pending => {}
}
if saw_supported && saw_unsupported_fallback {
return Ok(Some(true));
}
let now = Instant::now();
if now >= deadline {
if saw_supported {
return Ok(Some(true));
}
return Ok(saw_unsupported_fallback.then_some(false));
}
if !tty.poll_readable(deadline.saturating_duration_since(now))? {
if saw_supported {
return Ok(Some(true));
}
return Ok(saw_unsupported_fallback.then_some(false));
}
}
}
fn parse_cursor_position(buffer: &[u8]) -> Option<Position> {
for start in find_all_subslices(buffer, b"\x1B[") {
let rest = &buffer[start + 2..];
let Some(end) = rest.iter().position(|b| *b == b'R') else {
continue;
};
let Ok(payload) = std::str::from_utf8(&rest[..end]) else {
continue;
};
let Some((row, col)) = payload.split_once(';') else {
continue;
};
let Ok(row) = row.parse::<u16>() else {
continue;
};
let Ok(col) = col.parse::<u16>() else {
continue;
};
let row = row.saturating_sub(1);
let col = col.saturating_sub(1);
return Some(Position { x: col, y: row });
}
None
}
fn parse_osc_color(buffer: &[u8], slot: u8) -> Option<(u8, u8, u8)> {
let prefix = format!("\x1B]{slot};");
let start = find_subslice(buffer, prefix.as_bytes())?;
let payload_start = start + prefix.len();
let rest = &buffer[payload_start..];
let (payload_end, _terminator_len) = osc_payload_end(rest)?;
let payload = std::str::from_utf8(&rest[..payload_end]).ok()?;
parse_osc_rgb(payload)
}
fn parse_default_colors(buffer: &[u8]) -> Option<DefaultColors> {
let fg = parse_osc_color(buffer, /*slot*/ 10)?;
let bg = parse_osc_color(buffer, /*slot*/ 11)?;
Some(DefaultColors { fg, bg })
}
fn osc_payload_end(buffer: &[u8]) -> Option<(usize, usize)> {
let mut idx = 0;
while idx < buffer.len() {
match buffer[idx] {
0x07 => return Some((idx, 1)),
0x1B if buffer.get(idx + 1) == Some(&b'\\') => return Some((idx, 2)),
_ => idx += 1,
}
}
None
}
fn parse_osc_rgb(payload: &str) -> Option<(u8, u8, u8)> {
let (prefix, values) = payload.trim().split_once(':')?;
if !prefix.eq_ignore_ascii_case("rgb") && !prefix.eq_ignore_ascii_case("rgba") {
return None;
}
let mut parts = values.split('/');
let r = parse_osc_component(parts.next()?)?;
let g = parse_osc_component(parts.next()?)?;
let b = parse_osc_component(parts.next()?)?;
if prefix.eq_ignore_ascii_case("rgba") {
parse_osc_component(parts.next()?)?;
}
parts.next().is_none().then_some((r, g, b))
}
fn parse_osc_component(component: &str) -> Option<u8> {
match component.len() {
2 => u8::from_str_radix(component, 16).ok(),
4 => u16::from_str_radix(component, 16)
.ok()
.map(|value| (value / 257) as u8),
_ => None,
}
}
/// Parser state for the keyboard enhancement probe.
///
/// `UnsupportedFallback` records that a primary-device-attributes response arrived, but the
/// caller should keep waiting until the deadline because a later keyboard-flags response is
/// more specific. `Supported` records that keyboard flags arrived, but the caller should still
/// drain the PDA fallback response if it arrives before the deadline so those bytes do not leak
/// into the normal event stream.
#[derive(Debug, Clone, Copy, Eq, PartialEq)]
enum KeyboardProbeState {
Pending,
UnsupportedFallback,
Supported,
SupportedAndFallback,
}
fn parse_keyboard_enhancement_support(buffer: &[u8]) -> KeyboardProbeState {
match (
find_keyboard_flags(buffer).is_some(),
find_primary_device_attributes(buffer).is_some(),
) {
(true, true) => KeyboardProbeState::SupportedAndFallback,
(true, false) => KeyboardProbeState::Supported,
(false, true) => KeyboardProbeState::UnsupportedFallback,
(false, false) => KeyboardProbeState::Pending,
}
}
fn find_keyboard_flags(buffer: &[u8]) -> Option<KeyboardEnhancementFlags> {
for start in find_all_subslices(buffer, b"\x1B[?") {
let rest = &buffer[start + 3..];
let Some(end) = rest.iter().position(|b| *b == b'u') else {
continue;
};
if end == 0 {
continue;
}
let Ok(bits_text) = std::str::from_utf8(&rest[..end]) else {
continue;
};
let Ok(bits) = bits_text.parse::<u8>() else {
continue;
};
let mut flags = KeyboardEnhancementFlags::empty();
if bits & 1 != 0 {
flags |= KeyboardEnhancementFlags::DISAMBIGUATE_ESCAPE_CODES;
}
if bits & 2 != 0 {
flags |= KeyboardEnhancementFlags::REPORT_EVENT_TYPES;
}
if bits & 4 != 0 {
flags |= KeyboardEnhancementFlags::REPORT_ALTERNATE_KEYS;
}
if bits & 8 != 0 {
flags |= KeyboardEnhancementFlags::REPORT_ALL_KEYS_AS_ESCAPE_CODES;
}
return Some(flags);
}
None
}
fn find_primary_device_attributes(buffer: &[u8]) -> Option<()> {
for start in find_all_subslices(buffer, b"\x1B[?") {
let rest = &buffer[start + 3..];
let Some(end) = rest.iter().position(|b| *b == b'c') else {
continue;
};
if end > 0 && rest[..end].iter().all(|b| b.is_ascii_digit() || *b == b';') {
return Some(());
}
}
None
}
fn find_subslice(haystack: &[u8], needle: &[u8]) -> Option<usize> {
haystack
.windows(needle.len())
.position(|window| window == needle)
}
fn find_all_subslices<'a>(
haystack: &'a [u8],
needle: &'a [u8],
) -> impl Iterator<Item = usize> + 'a {
haystack
.windows(needle.len())
.enumerate()
.filter_map(move |(idx, window)| (window == needle).then_some(idx))
}
#[cfg(test)]
mod tests {
use super::*;
use pretty_assertions::assert_eq;
#[test]
fn parses_cursor_position_as_zero_based() {
assert_eq!(
parse_cursor_position(b"\x1B[20;10R"),
Some(Position { x: 9, y: 19 })
);
assert_eq!(
parse_cursor_position(b"\x1B[I\x1B[20;10R"),
Some(Position { x: 9, y: 19 })
);
}
#[test]
fn parses_osc_colors_with_bel_and_st() {
assert_eq!(
parse_osc_color(b"\x1B]10;rgb:ffff/8000/0000\x07", /*slot*/ 10),
Some((255, 127, 0))
);
assert_eq!(
parse_osc_color(b"\x1B]11;rgba:00/80/ff/ff\x1B\\", /*slot*/ 11),
Some((0, 128, 255))
);
}
#[test]
fn parses_two_and_four_digit_color_components() {
assert_eq!(parse_osc_rgb("rgb:00/80/ff"), Some((0, 128, 255)));
assert_eq!(
parse_osc_rgb("rgba:ffff/8000/0000/ffff"),
Some((255, 127, 0))
);
}
#[test]
fn parses_default_colors_from_one_buffer() {
assert_eq!(
parse_default_colors(
b"\x1B]10;rgb:eeee/eeee/eeee\x1B\\\x1B]11;rgb:1111/1111/1111\x07"
),
Some(DefaultColors {
fg: (238, 238, 238),
bg: (17, 17, 17)
})
);
assert_eq!(
parse_default_colors(
b"\x1B]11;rgb:1111/1111/1111\x07\x1B]10;rgb:eeee/eeee/eeee\x1B\\"
),
Some(DefaultColors {
fg: (238, 238, 238),
bg: (17, 17, 17)
})
);
assert_eq!(
parse_default_colors(b"\x1B]10;rgb:eeee/eeee/eeee\x1B\\"),
None
);
}
#[test]
fn parses_keyboard_enhancement_flags_and_pda_fallback() {
assert_eq!(
parse_keyboard_enhancement_support(b"\x1B[?7u"),
KeyboardProbeState::Supported
);
assert_eq!(
parse_keyboard_enhancement_support(b"\x1B[?64;1;2c"),
KeyboardProbeState::UnsupportedFallback
);
assert_eq!(
parse_keyboard_enhancement_support(b"\x1B[?64;1;2c\x1B[?7u"),
KeyboardProbeState::SupportedAndFallback
);
assert_eq!(
parse_keyboard_enhancement_support(b"\x1B[?7u\x1B[?64;1;2c"),
KeyboardProbeState::SupportedAndFallback
);
assert_eq!(
parse_keyboard_enhancement_support(b""),
KeyboardProbeState::Pending
);
}
}
}
#[cfg(unix)]
pub(crate) use imp::*;
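For reference, the OSC color replies parsed above carry up to 16 bits per channel (e.g. `rgb:ffff/8000/0000`), and the probe scales them to 8 bits by dividing by 257 (0x101), which maps 0xFFFF to 0xFF exactly. A standalone sketch of that parsing logic, simplified from the module's `parse_osc_rgb`/`parse_osc_component` (the `rgba` alpha slot is omitted here):

```rust
// Sketch of OSC 10/11 reply parsing: components are 2 or 4 hex digits per
// channel; a 16-bit value divided by 257 scales to 8 bits exactly.
fn component_to_u8(component: &str) -> Option<u8> {
    match component.len() {
        2 => u8::from_str_radix(component, 16).ok(),
        4 => u16::from_str_radix(component, 16)
            .ok()
            .map(|value| (value / 257) as u8),
        _ => None,
    }
}

fn parse_rgb_reply(payload: &str) -> Option<(u8, u8, u8)> {
    let values = payload.trim().strip_prefix("rgb:")?;
    let mut parts = values.split('/');
    let r = component_to_u8(parts.next()?)?;
    let g = component_to_u8(parts.next()?)?;
    let b = component_to_u8(parts.next()?)?;
    // Reject trailing components so malformed replies fall through to None.
    parts.next().is_none().then_some((r, g, b))
}

fn main() {
    assert_eq!(parse_rgb_reply("rgb:ffff/8000/0000"), Some((255, 127, 0)));
    assert_eq!(parse_rgb_reply("rgb:00/80/ff"), Some((0, 128, 255)));
    assert_eq!(parse_rgb_reply("rgb:00/80"), None);
}
```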


@@ -22,6 +22,7 @@ use crossterm::event::EnableFocusChange;
use crossterm::event::KeyEvent;
use crossterm::terminal::EnterAlternateScreen;
use crossterm::terminal::LeaveAlternateScreen;
#[cfg(not(unix))]
use crossterm::terminal::supports_keyboard_enhancement;
use ratatui::backend::Backend;
use ratatui::backend::CrosstermBackend;
@@ -289,11 +290,57 @@ pub fn init() -> Result<Terminal> {
set_panic_hook();
#[cfg(unix)]
let backend = CrosstermBackend::new(stdout());
let tui = CustomTerminal::with_options(backend)?;
#[cfg(unix)]
let cursor_pos =
match crate::terminal_probe::cursor_position(crate::terminal_probe::DEFAULT_TIMEOUT) {
Ok(Some(pos)) => pos,
Ok(None) => {
tracing::warn!("initial cursor position probe timed out; defaulting to origin");
Position { x: 0, y: 0 }
}
Err(err) => {
tracing::warn!(
"failed to read initial cursor position; defaulting to origin: {err}"
);
Position { x: 0, y: 0 }
}
};
#[cfg(not(unix))]
let mut backend = CrosstermBackend::new(stdout());
#[cfg(not(unix))]
let cursor_pos = cursor_position_with_crossterm(&mut backend);
let tui = CustomTerminal::with_options_and_cursor_position(backend, cursor_pos)?;
Ok(tui)
}
#[cfg(not(unix))]
fn cursor_position_with_crossterm(backend: &mut CrosstermBackend<Stdout>) -> Position {
backend.get_cursor_position().unwrap_or_else(|err| {
tracing::warn!("failed to read initial cursor position; defaulting to origin: {err}");
Position { x: 0, y: 0 }
})
}
#[cfg(unix)]
fn detect_keyboard_enhancement_supported() -> bool {
crate::terminal_probe::keyboard_enhancement_supported(crate::terminal_probe::DEFAULT_TIMEOUT)
.unwrap_or(/*default*/ None)
.unwrap_or(/*default*/ false)
}
#[cfg(not(unix))]
fn detect_keyboard_enhancement_supported() -> bool {
// Non-Unix startup keeps the existing crossterm path because the bounded probe implementation
// relies on Unix file descriptors and `/dev/tty` semantics.
supports_keyboard_enhancement().unwrap_or(/*default*/ false)
}
fn set_panic_hook() {
let hook = panic::take_hook();
panic::set_hook(Box::new(move |panic_info| {
@@ -346,7 +393,7 @@ impl Tui {
// Detect keyboard enhancement support before any EventStream is created so the
// crossterm poller can acquire its lock without contention.
let enhanced_keys_supported = !keyboard_modes::keyboard_enhancement_disabled()
&& supports_keyboard_enhancement().unwrap_or(false);
&& detect_keyboard_enhancement_supported();
// Cache this to avoid contention with the event reader.
supports_color::on_cached(supports_color::Stream::Stdout);
let _ = crate::terminal_palette::default_colors();


@@ -0,0 +1,7 @@
load("//:defs.bzl", "codex_rust_crate")
codex_rust_crate(
name = "web-server",
crate_name = "codex_web_server",
compile_data = glob(["assets/**"]),
)


@@ -0,0 +1,36 @@
[package]
name = "codex-web-server"
version.workspace = true
edition.workspace = true
license.workspace = true
[lib]
name = "codex_web_server"
path = "src/lib.rs"
[lints]
workspace = true
[dependencies]
anyhow = { workspace = true }
axum = { workspace = true, default-features = false, features = [
"http1",
"tokio",
"ws",
] }
clap = { workspace = true, features = ["derive"] }
codex-utils-pty = { workspace = true }
futures = { workspace = true }
include_dir = { workspace = true }
tokio = { workspace = true, features = [
"macros",
"net",
"rt-multi-thread",
] }
tracing = { workspace = true, features = ["log"] }
url = { workspace = true }
webbrowser = { workspace = true }
[dev-dependencies]
pretty_assertions = { workspace = true }
tokio-tungstenite = { workspace = true }

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@@ -0,0 +1,20 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Codex Web Terminal</title>
<script type="module" crossorigin src="/assets/index-DcK3rtbT.js"></script>
<link rel="stylesheet" crossorigin href="/assets/index-DZyd-Z17.css">
</head>
<body>
<main id="app" class="h-dvh bg-neutral-950 text-neutral-100">
<div id="terminal" class="h-full min-h-0"></div>
<div
id="status"
class="pointer-events-none fixed bottom-3 left-3 z-10 hidden max-w-[calc(100vw-1.5rem)] rounded border border-red-500/40 bg-neutral-950 px-3 py-2 text-sm text-red-200 shadow"
role="status"
></div>
</main>
</body>
</html>


@@ -0,0 +1,557 @@
use anyhow::Context;
use axum::Router;
use axum::body::Body;
use axum::extract::ConnectInfo;
use axum::extract::State;
use axum::extract::ws::Message;
use axum::extract::ws::WebSocket;
use axum::extract::ws::WebSocketUpgrade;
use axum::http::HeaderMap;
use axum::http::StatusCode;
use axum::http::header;
use axum::response::IntoResponse;
use axum::response::Response;
use axum::routing::get;
use clap::Args;
use codex_utils_pty::ProcessHandle;
use codex_utils_pty::SpawnedProcess;
use codex_utils_pty::TerminalSize;
use codex_utils_pty::spawn_pty_process;
use futures::SinkExt;
use futures::StreamExt;
use include_dir::Dir;
use include_dir::include_dir;
use std::collections::HashMap;
use std::net::IpAddr;
use std::net::SocketAddr;
use std::path::PathBuf;
use std::sync::Arc;
use tokio::net::TcpListener;
use tokio::sync::oneshot;
use tokio::task::JoinHandle;
use tracing::debug;
use tracing::warn;
use url::Position;
use url::Url;
const INPUT_FRAME: u8 = 0x00;
const RESIZE_FRAME: u8 = 0x01;
const DEFAULT_TERMINAL_SIZE: TerminalSize = TerminalSize { rows: 24, cols: 80 };
static WEBUI_ASSETS: Dir<'_> = include_dir!("$CARGO_MANIFEST_DIR/assets");
/// Run Codex in a browser-backed terminal served by the current Codex binary.
#[derive(Clone, Debug, Args)]
pub struct WebCommand {
/// Address to bind. Only loopback addresses are accepted.
#[arg(long, default_value = "127.0.0.1:0", value_name = "ADDR")]
pub listen: SocketAddr,
/// Working directory for the Codex session.
#[arg(long, value_name = "DIR")]
pub cwd: Option<PathBuf>,
/// Open the served URL in the default browser.
#[arg(long)]
pub open: bool,
/// Internal test hook: command to spawn instead of this Codex executable.
#[arg(long, hide = true, value_name = "PATH")]
pub command: Option<PathBuf>,
/// Internal test hook: argument for --command. May be repeated.
#[arg(long = "command-arg", hide = true, value_name = "ARG")]
pub command_args: Vec<String>,
/// Arguments forwarded to the inner Codex TUI. Pass them after `--`.
#[arg(last = true, value_name = "CODEX_ARGS")]
pub codex_args: Vec<String>,
}
#[derive(Clone, Debug)]
pub struct ServerConfig {
listen: SocketAddr,
open: bool,
command: PathBuf,
args: Vec<String>,
cwd: PathBuf,
}
#[derive(Clone)]
struct ServerState {
config: Arc<ServerConfig>,
}
#[derive(Debug, PartialEq, Eq)]
pub enum ClientFrame<'a> {
Input(&'a [u8]),
Resize { cols: u16, rows: u16 },
}
#[derive(Debug, PartialEq, Eq)]
pub enum FrameDecodeError {
Empty,
MalformedResize,
ZeroResize,
UnknownFrameType(u8),
}
pub struct StaticAsset {
pub path: &'static str,
pub content_type: &'static str,
pub cache_control: &'static str,
pub bytes: &'static [u8],
}
impl WebCommand {
pub fn into_server_config(
self,
inherited_config_overrides: Vec<String>,
) -> anyhow::Result<ServerConfig> {
if !self.listen.ip().is_loopback() {
anyhow::bail!("codex web only accepts loopback --listen addresses");
}
let cwd = match self.cwd {
Some(cwd) => cwd,
None => std::env::current_dir().context("failed to read current directory")?,
};
let (command, args) = match self.command {
Some(command) => (command, self.command_args),
None => {
let command = std::env::current_exe().context("failed to resolve current exe")?;
let mut args = Vec::new();
for config_override in inherited_config_overrides {
args.push("-c".to_string());
args.push(config_override);
}
args.extend(self.codex_args);
(command, args)
}
};
Ok(ServerConfig {
listen: self.listen,
open: self.open,
command,
args,
cwd,
})
}
}
pub async fn run(
command: WebCommand,
inherited_config_overrides: Vec<String>,
) -> anyhow::Result<()> {
let config = command.into_server_config(inherited_config_overrides)?;
let listener = TcpListener::bind(config.listen)
.await
.with_context(|| format!("failed to bind codex web listener on {}", config.listen))?;
let url = http_url_for_addr(listener.local_addr()?);
println!("Codex web listening on {url}");
if config.open
&& let Err(err) = webbrowser::open(&url)
{
eprintln!("Failed to open browser for {url}: {err}");
}
serve_listener(listener, config).await
}
pub async fn serve_listener(listener: TcpListener, config: ServerConfig) -> anyhow::Result<()> {
let state = ServerState {
config: Arc::new(config),
};
let router = Router::new()
.route("/healthz", get(healthz))
.route("/api/pty", get(pty_websocket))
.fallback(get(static_handler))
.with_state(state);
axum::serve(
listener,
router.into_make_service_with_connect_info::<SocketAddr>(),
)
.await
.context("codex web server failed")
}
async fn healthz() -> StatusCode {
StatusCode::OK
}
async fn pty_websocket(
websocket: WebSocketUpgrade,
ConnectInfo(peer_addr): ConnectInfo<SocketAddr>,
State(state): State<ServerState>,
headers: HeaderMap,
) -> Response {
if !origin_is_allowed(&headers) {
warn!(%peer_addr, "rejecting codex web websocket due to Origin mismatch");
return StatusCode::FORBIDDEN.into_response();
}
websocket
.on_upgrade(move |socket| handle_pty_socket(socket, state.config))
.into_response()
}
async fn static_handler(request: axum::http::Request<Body>) -> Response {
let path = request.uri().path();
if path.starts_with("/api/") {
return StatusCode::NOT_FOUND.into_response();
}
match static_asset_for_path(path) {
Some(asset) => (
[
(header::CONTENT_TYPE, asset.content_type),
(header::CACHE_CONTROL, asset.cache_control),
],
asset.bytes,
)
.into_response(),
None => StatusCode::NOT_FOUND.into_response(),
}
}
pub fn static_asset_for_path(request_path: &str) -> Option<StaticAsset> {
let asset_path = normalize_asset_path(request_path)?;
let file = WEBUI_ASSETS
.get_file(&asset_path)
.or_else(|| WEBUI_ASSETS.get_file("index.html"))?;
let path = file.path().to_str()?;
Some(StaticAsset {
path,
content_type: content_type_for_path(path),
cache_control: if path == "index.html" {
"no-store"
} else {
"public, max-age=31536000, immutable"
},
bytes: file.contents(),
})
}
fn normalize_asset_path(request_path: &str) -> Option<String> {
let trimmed = request_path.strip_prefix('/').unwrap_or(request_path);
let path = if trimmed.is_empty() {
"index.html"
} else {
trimmed
};
if path
.split('/')
.any(|component| component.is_empty() || component == "." || component == "..")
|| path.contains('\\')
{
return None;
}
Some(path.to_string())
}
fn content_type_for_path(path: &str) -> &'static str {
match path.rsplit_once('.').map(|(_, extension)| extension) {
Some("css") => "text/css; charset=utf-8",
Some("html") => "text/html; charset=utf-8",
Some("js") => "text/javascript; charset=utf-8",
Some("json") | Some("map") => "application/json; charset=utf-8",
Some("svg") => "image/svg+xml",
Some("wasm") => "application/wasm",
_ => "application/octet-stream",
}
}
pub fn decode_client_frame(bytes: &[u8]) -> Result<ClientFrame<'_>, FrameDecodeError> {
let Some(kind) = bytes.first().copied() else {
return Err(FrameDecodeError::Empty);
};
match kind {
INPUT_FRAME => Ok(ClientFrame::Input(&bytes[1..])),
RESIZE_FRAME => {
if bytes.len() != 5 {
return Err(FrameDecodeError::MalformedResize);
}
let cols = u16::from_be_bytes([bytes[1], bytes[2]]);
let rows = u16::from_be_bytes([bytes[3], bytes[4]]);
if cols == 0 || rows == 0 {
return Err(FrameDecodeError::ZeroResize);
}
Ok(ClientFrame::Resize { cols, rows })
}
other => Err(FrameDecodeError::UnknownFrameType(other)),
}
}
async fn handle_pty_socket(socket: WebSocket, config: Arc<ServerConfig>) {
let spawned = match spawn_codex_pty(config.as_ref()).await {
Ok(spawned) => spawned,
Err(err) => {
let mut socket = socket;
let message = format!("Failed to start PTY: {err}\r\n");
let _ = socket
.send(Message::Binary(message.into_bytes().into()))
.await;
let _ = socket.close().await;
return;
}
};
bridge_socket_to_pty(socket, spawned).await;
}
async fn spawn_codex_pty(config: &ServerConfig) -> anyhow::Result<SpawnedProcess> {
let command = config
.command
.to_str()
.ok_or_else(|| anyhow::anyhow!("codex web command path is not valid UTF-8"))?;
spawn_pty_process(
command,
&config.args,
&config.cwd,
&child_environment(),
/*arg0*/ &None,
DEFAULT_TERMINAL_SIZE,
)
.await
.with_context(|| format!("failed to spawn {}", config.command.display()))
}
async fn bridge_socket_to_pty(socket: WebSocket, spawned: SpawnedProcess) {
let SpawnedProcess {
session,
mut stdout_rx,
stderr_rx: _,
mut exit_rx,
} = spawned;
let session = Arc::new(session);
let writer = session.writer_sender();
let resize_session = Arc::clone(&session);
let (mut websocket_writer, mut websocket_reader) = socket.split();
let mut outbound_task: JoinHandle<()> = tokio::spawn(async move {
loop {
tokio::select! {
output = stdout_rx.recv() => {
let Some(output) = output else {
break;
};
if websocket_writer.send(Message::Binary(output.into())).await.is_err() {
break;
}
}
exit = &mut exit_rx => {
let code = exit.unwrap_or(-1);
let message = format!("\r\n[process exited: {code}]\r\n");
let _ = websocket_writer
.send(Message::Binary(message.into_bytes().into()))
.await;
let _ = websocket_writer.close().await;
break;
}
}
}
});
let mut inbound_task: JoinHandle<()> = tokio::spawn(async move {
while let Some(message) = websocket_reader.next().await {
match message {
Ok(Message::Binary(bytes)) => match decode_client_frame(&bytes) {
Ok(ClientFrame::Input(input)) => {
if writer.send(input.to_vec()).await.is_err() {
break;
}
}
Ok(ClientFrame::Resize { cols, rows }) => {
if let Err(err) = resize_session.resize(TerminalSize { rows, cols }) {
debug!("failed to resize codex web PTY: {err}");
}
}
Err(err) => {
debug!("ignoring malformed codex web frame: {err:?}");
}
},
Ok(Message::Close(_)) | Err(_) => break,
Ok(Message::Text(_) | Message::Ping(_) | Message::Pong(_)) => {}
}
}
});
tokio::select! {
_ = &mut outbound_task => {
inbound_task.abort();
}
_ = &mut inbound_task => {
outbound_task.abort();
}
}
terminate_process(&session);
}
fn terminate_process(session: &ProcessHandle) {
session.terminate();
}
fn child_environment() -> HashMap<String, String> {
let mut env: HashMap<String, String> = std::env::vars().collect();
env.insert("TERM".to_string(), "xterm-256color".to_string());
env.insert("COLORTERM".to_string(), "truecolor".to_string());
env.insert("TERM_PROGRAM".to_string(), "wterm".to_string());
env.insert(
"CODEX_TUI_DISABLE_KEYBOARD_ENHANCEMENT".to_string(),
"1".to_string(),
);
env
}
fn origin_is_allowed(headers: &HeaderMap) -> bool {
let Some(origin) = headers.get(header::ORIGIN) else {
return true;
};
let Ok(origin) = origin.to_str() else {
return false;
};
let Ok(origin) = Url::parse(origin) else {
return false;
};
let Some(host) = headers.get(header::HOST) else {
return false;
};
let Ok(host) = host.to_str() else {
return false;
};
origin[Position::BeforeHost..Position::AfterPort].eq_ignore_ascii_case(host)
}
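The check above allows requests without an `Origin` header (non-browser clients) and otherwise requires the origin's host:port to match the `Host` header. The real code slices the authority out with the `url` crate's `Position` range; the sketch below (function name hypothetical) strips the scheme by hand and assumes a well-formed `http://host:port` origin:

```rust
// Simplified same-origin sketch: a browser's Origin header must carry the
// same host:port the request was addressed to (its Host header), which keeps
// cross-origin pages from driving the loopback PTY websocket.
fn origin_matches_host(origin: &str, host: &str) -> bool {
    origin
        .strip_prefix("http://")
        .or_else(|| origin.strip_prefix("https://"))
        .is_some_and(|authority| authority.eq_ignore_ascii_case(host))
}

fn main() {
    assert!(origin_matches_host("http://127.0.0.1:8080", "127.0.0.1:8080"));
    assert!(!origin_matches_host("http://evil.example", "127.0.0.1:8080"));
}
```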
fn http_url_for_addr(addr: SocketAddr) -> String {
match addr.ip() {
IpAddr::V4(ip) => format!("http://{ip}:{}", addr.port()),
IpAddr::V6(ip) => format!("http://[{ip}]:{}", addr.port()),
}
}
pub async fn spawn_for_test(
command: PathBuf,
args: Vec<String>,
) -> anyhow::Result<(String, oneshot::Sender<()>, JoinHandle<anyhow::Result<()>>)> {
let listener = TcpListener::bind("127.0.0.1:0").await?;
let url = http_url_for_addr(listener.local_addr()?);
let (shutdown_tx, shutdown_rx) = oneshot::channel::<()>();
let config = ServerConfig {
listen: listener.local_addr()?,
open: false,
command,
args,
cwd: std::env::current_dir()?,
};
let handle = tokio::spawn(async move {
let state = ServerState {
config: Arc::new(config),
};
let router = Router::new()
.route("/healthz", get(healthz))
.route("/api/pty", get(pty_websocket))
.fallback(get(static_handler))
.with_state(state);
axum::serve(
listener,
router.into_make_service_with_connect_info::<SocketAddr>(),
)
.with_graceful_shutdown(async move {
let _ = shutdown_rx.await;
})
.await
.context("codex web test server failed")
});
Ok((url, shutdown_tx, handle))
}
#[cfg(test)]
mod tests {
use super::*;
use pretty_assertions::assert_eq;
#[test]
fn decodes_input_frames() {
let frame = [INPUT_FRAME, b'h', b'i'];
assert_eq!(decode_client_frame(&frame), Ok(ClientFrame::Input(b"hi")));
}
#[test]
fn decodes_resize_frames() {
let frame = [RESIZE_FRAME, 0, 120, 0, 40];
assert_eq!(
decode_client_frame(&frame),
Ok(ClientFrame::Resize {
cols: 120,
rows: 40
})
);
}
#[test]
fn rejects_invalid_frames() {
assert_eq!(decode_client_frame(&[]), Err(FrameDecodeError::Empty));
assert_eq!(
decode_client_frame(&[RESIZE_FRAME, 0, 80]),
Err(FrameDecodeError::MalformedResize)
);
assert_eq!(
decode_client_frame(&[RESIZE_FRAME, 0, 0, 0, 24]),
Err(FrameDecodeError::ZeroResize)
);
assert_eq!(
decode_client_frame(&[9]),
Err(FrameDecodeError::UnknownFrameType(9))
);
}
#[test]
fn serves_index_for_root_and_spa_routes() {
let root = static_asset_for_path("/").expect("root asset");
let route = static_asset_for_path("/thread/123").expect("route asset");
assert_eq!(root.path, "index.html");
assert_eq!(route.path, "index.html");
assert_eq!(root.content_type, "text/html; charset=utf-8");
assert_eq!(root.cache_control, "no-store");
}
#[test]
fn rejects_path_traversal() {
assert!(static_asset_for_path("/../Cargo.toml").is_none());
assert!(static_asset_for_path("/assets\\index.js").is_none());
}
#[cfg(unix)]
#[tokio::test]
async fn websocket_bridges_pty_output() -> anyhow::Result<()> {
let (url, shutdown, handle) = spawn_for_test(
PathBuf::from("/bin/sh"),
vec!["-c".to_string(), "printf READY; cat".to_string()],
)
.await?;
let ws_url = format!("{url}/api/pty").replace("http://", "ws://");
let (mut socket, _) = tokio_tungstenite::connect_async(&ws_url).await?;
let mut saw_ready = false;
for _ in 0..8 {
if let Some(message) = socket.next().await {
let message = message?;
if message
.into_data()
.windows("READY".len())
.any(|w| w == b"READY")
{
saw_ready = true;
break;
}
}
}
let _ = socket.close(None).await;
let _ = shutdown.send(());
let _ = handle.await?;
assert!(saw_ready);
Ok(())
}
}

codex-webui/index.html Normal file

@@ -0,0 +1,19 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Codex Web Terminal</title>
</head>
<body>
<main id="app" class="h-dvh bg-neutral-950 text-neutral-100">
<div id="terminal" class="h-full min-h-0"></div>
<div
id="status"
class="pointer-events-none fixed bottom-3 left-3 z-10 hidden max-w-[calc(100vw-1.5rem)] rounded border border-red-500/40 bg-neutral-950 px-3 py-2 text-sm text-red-200 shadow"
role="status"
></div>
</main>
<script type="module" src="/src/main.ts"></script>
</body>
</html>

codex-webui/package.json Normal file

@@ -0,0 +1,30 @@
{
"name": "@openai/codex-webui",
"version": "0.0.0-dev",
"private": true,
"type": "module",
"scripts": {
"dev": "node server.mjs --dev",
"start": "node server.mjs",
"build": "vite build",
"build:rust-assets": "pnpm build && node scripts/sync-rust-assets.mjs",
"check:rust-assets": "pnpm build && node scripts/sync-rust-assets.mjs --check",
"typecheck": "tsc --noEmit",
"test": "node --test tests/*.test.mjs"
},
"dependencies": {
"@tailwindcss/vite": "4.2.4",
"@wterm/dom": "0.3.0",
"@wterm/ghostty": "0.3.0",
"node-pty": "1.2.0-beta.12",
"tailwindcss": "4.2.4",
"vite": "8.0.10",
"ws": "8.20.0"
},
"devDependencies": {
"typescript": "6.0.3"
},
"engines": {
"node": ">=22"
}
}

codex-webui/protocol.mjs Normal file

@@ -0,0 +1,28 @@
export const INPUT_FRAME = 0x00;
export const RESIZE_FRAME = 0x01;
export function decodeClientFrame(frame) {
const data = Buffer.isBuffer(frame) ? frame : Buffer.from(frame);
if (data.length === 0) {
return { type: "invalid", reason: "empty frame" };
}
const kind = data.readUInt8(0);
if (kind === INPUT_FRAME) {
return { type: "input", data: data.subarray(1).toString("utf8") };
}
if (kind === RESIZE_FRAME) {
if (data.length !== 5) {
return { type: "invalid", reason: "malformed resize frame" };
}
const cols = data.readUInt16BE(1);
const rows = data.readUInt16BE(3);
if (cols === 0 || rows === 0) {
return { type: "invalid", reason: "resize dimensions must be positive" };
}
return { type: "resize", cols, rows };
}
return { type: "invalid", reason: `unknown frame type ${kind}` };
}
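As a quick check of the wire format above, the one-byte-tag framing can be exercised standalone (a self-contained sketch that re-implements the encoders inline rather than importing protocol.mjs):

```javascript
// Wire format from protocol.mjs: byte 0 is the frame tag
// (0x00 input, 0x01 resize); a resize frame carries cols and rows
// as big-endian u16 values in bytes 1-4.
const INPUT_FRAME = 0x00;
const RESIZE_FRAME = 0x01;

function encodeInput(text) {
  return Buffer.concat([Buffer.from([INPUT_FRAME]), Buffer.from(text, "utf8")]);
}

function encodeResize(cols, rows) {
  const frame = Buffer.alloc(5);
  frame.writeUInt8(RESIZE_FRAME, 0);
  frame.writeUInt16BE(cols, 1);
  frame.writeUInt16BE(rows, 3);
  return frame;
}

console.log([...encodeResize(120, 40)]); // [1, 0, 120, 0, 40]
```

The same bytes round-trip through `decodeClientFrame` above and through the Rust `decode_client_frame` tests.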

codex-webui/scripts/sync-rust-assets.mjs Normal file

@@ -0,0 +1,85 @@
#!/usr/bin/env node
import { createHash } from "node:crypto";
import {
cpSync,
existsSync,
mkdirSync,
readdirSync,
readFileSync,
rmSync,
} from "node:fs";
import { dirname, join, relative, resolve } from "node:path";
import { fileURLToPath } from "node:url";
const __filename = fileURLToPath(import.meta.url);
const packageRoot = resolve(dirname(__filename), "..");
const repoRoot = resolve(packageRoot, "..");
const distRoot = join(packageRoot, "dist");
const rustAssetsRoot = join(repoRoot, "codex-rs", "web-server", "assets");
const check = process.argv.includes("--check");
if (!existsSync(join(distRoot, "index.html"))) {
throw new Error("Missing dist/index.html. Run the codex-webui build first.");
}
if (check) {
const diff = diffTrees(distRoot, rustAssetsRoot);
if (diff.length > 0) {
throw new Error(
`codex web assets are out of date:\n${diff.map((item) => ` ${item}`).join("\n")}`,
);
}
console.log("codex web Rust assets are up to date.");
} else {
rmSync(rustAssetsRoot, { force: true, recursive: true });
mkdirSync(rustAssetsRoot, { recursive: true });
cpSync(distRoot, rustAssetsRoot, { recursive: true });
console.log(
`Synced ${relative(repoRoot, distRoot)} to ${relative(repoRoot, rustAssetsRoot)}.`,
);
}
function diffTrees(leftRoot, rightRoot) {
const leftFiles = listFiles(leftRoot);
const rightFiles = listFiles(rightRoot);
const allFiles = new Set([...leftFiles.keys(), ...rightFiles.keys()]);
const diff = [];
for (const file of [...allFiles].sort()) {
const left = leftFiles.get(file);
const right = rightFiles.get(file);
if (!left) {
diff.push(`unexpected ${file}`);
} else if (!right) {
diff.push(`missing ${file}`);
} else if (left !== right) {
diff.push(`changed ${file}`);
}
}
return diff;
}
function listFiles(root) {
const files = new Map();
if (!existsSync(root)) {
return files;
}
walk(root, root, files);
return files;
}
function walk(root, dir, files) {
for (const entry of readdirSync(dir, { withFileTypes: true })) {
const path = join(dir, entry.name);
if (entry.isDirectory()) {
walk(root, path, files);
} else if (entry.isFile()) {
files.set(relative(root, path), sha256(path));
}
}
}
function sha256(path) {
return createHash("sha256").update(readFileSync(path)).digest("hex");
}
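The `--check` classification above (unexpected / missing / changed) can be illustrated over plain path-to-hash Maps (a hypothetical `diffMaps` helper that mirrors `diffTrees`, no filesystem needed):

```javascript
// Same classification as diffTrees, but over in-memory Maps of
// relative path -> content hash instead of directory trees on disk.
function diffMaps(dist, assets) {
  const all = new Set([...dist.keys(), ...assets.keys()]);
  const diff = [];
  for (const file of [...all].sort()) {
    if (!dist.has(file)) diff.push(`unexpected ${file}`);
    else if (!assets.has(file)) diff.push(`missing ${file}`);
    else if (dist.get(file) !== assets.get(file)) diff.push(`changed ${file}`);
  }
  return diff;
}

const dist = new Map([["index.html", "h1"], ["main.js", "h2"]]);
const assets = new Map([["index.html", "h1"], ["main.js", "old"], ["stale.js", "h3"]]);
console.log(diffMaps(dist, assets)); // ["changed main.js", "unexpected stale.js"]
```

A file present only in `assets` is "unexpected" (stale output that a sync would delete), one present only in `dist` is "missing", and a hash mismatch is "changed".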

codex-webui/server.mjs Normal file

@@ -0,0 +1,309 @@
#!/usr/bin/env node
import { createServer } from "node:http";
import { createReadStream, existsSync } from "node:fs";
import { dirname, extname, join, resolve } from "node:path";
import { fileURLToPath } from "node:url";
import pty from "node-pty";
import { WebSocketServer } from "ws";
import { decodeClientFrame } from "./protocol.mjs";
const __filename = fileURLToPath(import.meta.url);
const packageRoot = dirname(__filename);
const repoRoot = resolve(packageRoot, "..");
const MIME_TYPES = {
".css": "text/css; charset=utf-8",
".html": "text/html; charset=utf-8",
".js": "text/javascript; charset=utf-8",
".json": "application/json; charset=utf-8",
".map": "application/json; charset=utf-8",
".svg": "image/svg+xml",
".wasm": "application/wasm",
};
const SERVER_FLAGS = new Set([
"--dev",
"--shell",
"--host",
"--port",
"--cwd",
"--codex-bin",
]);
export function parseArgs(argv) {
const options = {
dev: false,
host: "127.0.0.1",
port: 4321,
cwd: repoRoot,
codexBin: null,
shell: false,
codexArgs: [],
};
for (let index = 0; index < argv.length; index += 1) {
const arg = argv[index];
if (arg === "--") {
// pnpm inserts "--" before forwarded script args; when the next token is a
// known server flag, treat this "--" as pnpm's separator and keep parsing
// server options. Otherwise everything after it goes to codex verbatim.
if (SERVER_FLAGS.has(argv[index + 1])) {
continue;
}
options.codexArgs = argv.slice(index + 1);
break;
}
if (arg === "--dev") {
options.dev = true;
} else if (arg === "--shell") {
options.shell = true;
} else if (arg === "--host") {
options.host = requireValue(argv, (index += 1), arg);
} else if (arg === "--port") {
options.port = Number.parseInt(requireValue(argv, (index += 1), arg), 10);
if (!Number.isFinite(options.port) || options.port <= 0) {
throw new Error("--port must be a positive integer");
}
} else if (arg === "--cwd") {
options.cwd = resolve(requireValue(argv, (index += 1), arg));
} else if (arg === "--codex-bin") {
options.codexBin = resolve(requireValue(argv, (index += 1), arg));
} else {
throw new Error(`Unknown argument: ${arg}`);
}
}
return options;
}
function requireValue(argv, index, flag) {
const value = argv[index];
if (!value) {
throw new Error(`${flag} requires a value`);
}
return value;
}
function commandFor(options) {
if (options.shell) {
return {
command: process.env.SHELL || "/bin/sh",
args: ["-l"],
};
}
if (options.codexBin) {
return {
command: options.codexBin,
args: options.codexArgs,
};
}
const debugCodex = join(
repoRoot,
"codex-rs",
"target",
"debug",
process.platform === "win32" ? "codex.exe" : "codex",
);
if (existsSync(debugCodex)) {
return {
command: debugCodex,
args: options.codexArgs,
};
}
return {
command: "cargo",
args: [
"run",
"--manifest-path",
join(repoRoot, "codex-rs", "Cargo.toml"),
"--bin",
"codex",
"--",
...options.codexArgs,
],
};
}
function createPty(options) {
const { command, args } = commandFor(options);
return pty.spawn(command, args, {
name: "xterm-256color",
cols: 80,
rows: 24,
cwd: options.cwd,
env: {
...process.env,
TERM: "xterm-256color",
COLORTERM: "truecolor",
TERM_PROGRAM: "wterm",
CODEX_TUI_DISABLE_KEYBOARD_ENHANCEMENT: "1",
},
});
}
function isAllowedOrigin(request) {
const origin = request.headers.origin;
if (!origin) {
return true;
}
try {
return new URL(origin).host === request.headers.host;
} catch {
return false;
}
}
function sendHttp(response, status, body, headers = {}) {
response.writeHead(status, {
"content-type": "text/plain; charset=utf-8",
"cache-control": "no-store",
...headers,
});
response.end(body);
}
async function serveStatic(request, response) {
const distRoot = join(packageRoot, "dist");
const url = new URL(request.url ?? "/", "http://localhost");
const pathname = decodeURIComponent(url.pathname);
const relativePath = pathname === "/" ? "index.html" : pathname.slice(1);
const resolvedPath = resolve(distRoot, relativePath);
if (!resolvedPath.startsWith(`${distRoot}/`) && resolvedPath !== distRoot) {
sendHttp(response, 403, "Forbidden");
return;
}
const filePath = existsSync(resolvedPath)
? resolvedPath
: join(distRoot, "index.html");
const contentType =
MIME_TYPES[extname(filePath)] || "application/octet-stream";
response.writeHead(200, {
"content-type": contentType,
"cache-control": filePath.endsWith("index.html")
? "no-store"
: "public, max-age=31536000, immutable",
});
createReadStream(filePath).pipe(response);
}
async function main() {
const options = parseArgs(process.argv.slice(2));
const server = createServer();
const wss = new WebSocketServer({ noServer: true });
if (options.dev) {
const { createServer: createViteServer } = await import("vite");
const vite = await createViteServer({
root: packageRoot,
server: { middlewareMode: true },
appType: "spa",
});
server.on("request", (request, response) => {
vite.middlewares(request, response, () => {
sendHttp(response, 404, "Not found");
});
});
} else {
server.on("request", (request, response) => {
if (request.url === "/healthz") {
sendHttp(response, 200, "ok");
return;
}
if (!existsSync(join(packageRoot, "dist", "index.html"))) {
sendHttp(
response,
500,
"Missing dist/. Run pnpm --filter @openai/codex-webui build first.",
);
return;
}
void serveStatic(request, response).catch((error) => {
sendHttp(
response,
500,
error instanceof Error ? error.message : String(error),
);
});
});
}
server.on("upgrade", (request, socket, head) => {
const url = new URL(request.url ?? "/", "http://localhost");
if (url.pathname !== "/api/pty" || !isAllowedOrigin(request)) {
socket.write("HTTP/1.1 403 Forbidden\r\n\r\n");
socket.destroy();
return;
}
wss.handleUpgrade(request, socket, head, (websocket) => {
wss.emit("connection", websocket, request);
});
});
wss.on("connection", (websocket) => {
let child;
try {
child = createPty(options);
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
if (websocket.readyState === websocket.OPEN) {
websocket.send(Buffer.from(`Failed to start PTY: ${message}\r\n`));
websocket.close();
}
return;
}
const output = child.onData((data) => {
if (websocket.readyState === websocket.OPEN) {
websocket.send(Buffer.from(data, "utf8"));
}
});
child.onExit(({ exitCode, signal }) => {
if (websocket.readyState === websocket.OPEN) {
websocket.send(
Buffer.from(
`\r\n[process exited: ${signal ?? exitCode}]\r\n`,
"utf8",
),
);
websocket.close();
}
});
websocket.on("message", (data) => {
const frame = decodeClientFrame(data);
if (frame.type === "input") {
child.write(frame.data);
} else if (frame.type === "resize") {
child.resize(frame.cols, frame.rows);
}
});
websocket.on("close", () => {
output.dispose();
child.kill();
});
});
await new Promise((resolveListen) => {
server.listen(options.port, options.host, resolveListen);
});
const address = server.address();
const port =
typeof address === "object" && address ? address.port : options.port;
console.log(`Codex web terminal listening on http://${options.host}:${port}`);
}
if (process.argv[1] === __filename) {
main().catch((error) => {
console.error(error instanceof Error ? error.message : error);
process.exit(1);
});
}
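server.mjs gates websocket upgrades with a same-host origin policy; the rule in isolation (a hypothetical `originAllowed` with the same logic as `isAllowedOrigin`):

```javascript
// Allow requests with no Origin header (non-browser clients), and
// browser requests whose Origin host matches the Host header.
function originAllowed(origin, host) {
  if (!origin) return true;
  try {
    return new URL(origin).host === host;
  } catch {
    return false; // malformed Origin header
  }
}

console.log(originAllowed("http://127.0.0.1:4321", "127.0.0.1:4321")); // true
console.log(originAllowed("http://evil.example", "127.0.0.1:4321")); // false
```

A missing Origin is allowed deliberately so curl and native websocket clients can connect; cross-site browser pages always send an Origin header and are rejected.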

codex-webui/src/main.ts Normal file

@@ -0,0 +1,138 @@
import { WTerm } from "@wterm/dom";
import { GhosttyCore } from "@wterm/ghostty";
import "./styles.css";
const INPUT_FRAME = 0x00;
const RESIZE_FRAME = 0x01;
function requireElement(id: string): HTMLElement {
const element = document.getElementById(id);
if (!(element instanceof HTMLElement)) {
throw new Error(`Missing ${id} element`);
}
return element;
}
const terminalElement = requireElement("terminal");
const statusElement = requireElement("status");
const encoder = new TextEncoder();
let socket: WebSocket | null = null;
const pendingFrames: Uint8Array[] = [];
function showStatus(message: string): void {
statusElement.textContent = message;
statusElement.classList.remove("hidden");
}
function clearStatus(): void {
statusElement.textContent = "";
statusElement.classList.add("hidden");
}
function encodeInput(data: string): Uint8Array {
const encoded = encoder.encode(data);
const frame = new Uint8Array(1 + encoded.length);
frame[0] = INPUT_FRAME;
frame.set(encoded, 1);
return frame;
}
function encodeResize(cols: number, rows: number): Uint8Array {
const frame = new Uint8Array(5);
const view = new DataView(frame.buffer);
view.setUint8(0, RESIZE_FRAME);
view.setUint16(1, cols, false);
view.setUint16(3, rows, false);
return frame;
}
function sendFrame(frame: Uint8Array): void {
const payload = frame.buffer.slice(
frame.byteOffset,
frame.byteOffset + frame.byteLength,
) as ArrayBuffer;
if (socket?.readyState === WebSocket.OPEN) {
socket.send(payload);
return;
}
pendingFrames.push(new Uint8Array(payload));
}
function flushPendingFrames(): void {
while (socket?.readyState === WebSocket.OPEN && pendingFrames.length > 0) {
const frame = pendingFrames.shift();
if (frame) {
const payload = frame.buffer.slice(
frame.byteOffset,
frame.byteOffset + frame.byteLength,
) as ArrayBuffer;
socket.send(payload);
}
}
}
function websocketUrl(): string {
const protocol = window.location.protocol === "https:" ? "wss:" : "ws:";
return `${protocol}//${window.location.host}/api/pty`;
}
async function main(): Promise<void> {
const core = await GhosttyCore.load({ scrollbackLimit: 10000 });
const term = new WTerm(terminalElement, {
core,
autoResize: true,
cursorBlink: false,
onData(data) {
sendFrame(encodeInput(data));
},
onResize(cols, rows) {
sendFrame(encodeResize(cols, rows));
},
onTitle(title) {
document.title = title
? `${title} - Codex Web Terminal`
: "Codex Web Terminal";
},
});
socket = new WebSocket(websocketUrl());
socket.binaryType = "arraybuffer";
socket.addEventListener("open", () => {
clearStatus();
flushPendingFrames();
term.focus();
});
socket.addEventListener("message", (event) => {
if (event.data instanceof ArrayBuffer) {
term.write(new Uint8Array(event.data));
return;
}
if (event.data instanceof Blob) {
void event.data.arrayBuffer().then((buffer) => {
term.write(new Uint8Array(buffer));
});
return;
}
term.write(String(event.data));
});
socket.addEventListener("close", () => {
showStatus("Terminal session disconnected.");
});
socket.addEventListener("error", () => {
showStatus("Terminal connection failed.");
});
await term.init();
term.focus();
}
main().catch((error: unknown) => {
const message = error instanceof Error ? error.message : String(error);
showStatus(`Failed to start terminal: ${message}`);
});
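The `sendFrame`/`flushPendingFrames` pair above implements buffer-until-open; the pattern in isolation (a hypothetical `makeBufferedSender` over a minimal transport interface, plain callbacks in place of a WebSocket):

```javascript
// Buffered sender: queue payloads until the transport reports open,
// then flush in order. Mirrors sendFrame/flushPendingFrames in main.ts.
function makeBufferedSender(transport) {
  const pending = [];
  return {
    send(payload) {
      if (transport.isOpen()) transport.send(payload);
      else pending.push(payload);
    },
    flush() {
      // Called from the "open" handler; preserves first-in, first-out order.
      while (transport.isOpen() && pending.length > 0) {
        transport.send(pending.shift());
      }
    },
  };
}
```

This keeps early keystrokes (typed while the socket is still connecting) from being dropped, at the cost of holding them in memory until the connection opens.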

codex-webui/src/styles.css Normal file

@@ -0,0 +1,19 @@
@import "tailwindcss";
@import "@wterm/dom/css";
html,
body {
margin: 0;
min-height: 100%;
}
body {
overflow: hidden;
}
#terminal {
padding: max(env(safe-area-inset-top), 0.5rem)
max(env(safe-area-inset-right), 0.5rem)
max(env(safe-area-inset-bottom), 0.5rem)
max(env(safe-area-inset-left), 0.5rem);
}

codex-webui/src/vite-env.d.ts vendored Normal file

@@ -0,0 +1 @@
/// <reference types="vite/client" />


@@ -0,0 +1,43 @@
import assert from "node:assert/strict";
import test from "node:test";
import { decodeClientFrame, INPUT_FRAME, RESIZE_FRAME } from "../protocol.mjs";
test("decodes input frames", () => {
const frame = Buffer.concat([
Buffer.from([INPUT_FRAME]),
Buffer.from("hello"),
]);
assert.deepEqual(decodeClientFrame(frame), { type: "input", data: "hello" });
});
test("decodes resize frames", () => {
const frame = Buffer.alloc(5);
frame.writeUInt8(RESIZE_FRAME, 0);
frame.writeUInt16BE(120, 1);
frame.writeUInt16BE(40, 3);
assert.deepEqual(decodeClientFrame(frame), {
type: "resize",
cols: 120,
rows: 40,
});
});
test("rejects malformed resize frames", () => {
assert.deepEqual(decodeClientFrame(Buffer.from([RESIZE_FRAME, 0, 80])), {
type: "invalid",
reason: "malformed resize frame",
});
});
test("rejects zero resize dimensions", () => {
const frame = Buffer.alloc(5);
frame.writeUInt8(RESIZE_FRAME, 0);
frame.writeUInt16BE(0, 1);
frame.writeUInt16BE(24, 3);
assert.deepEqual(decodeClientFrame(frame), {
type: "invalid",
reason: "resize dimensions must be positive",
});
});


@@ -0,0 +1,16 @@
import assert from "node:assert/strict";
import test from "node:test";
import { parseArgs } from "../server.mjs";
test("parses pnpm script argument separator before server flags", () => {
const options = parseArgs(["--dev", "--", "--port", "4322", "--shell"]);
assert.equal(options.dev, true);
assert.equal(options.port, 4322);
assert.equal(options.shell, true);
assert.deepEqual(options.codexArgs, []);
});
test("uses separator before non-server flags as codex args", () => {
const options = parseArgs(["--dev", "--", "--model", "gpt-test"]);
assert.deepEqual(options.codexArgs, ["--model", "gpt-test"]);
});

16
codex-webui/tsconfig.json Normal file
View File

@@ -0,0 +1,16 @@
{
"compilerOptions": {
"target": "ES2022",
"useDefineForClassFields": true,
"module": "ESNext",
"lib": ["ES2022", "DOM", "DOM.Iterable"],
"skipLibCheck": true,
"moduleResolution": "Bundler",
"allowImportingTsExtensions": false,
"isolatedModules": true,
"moduleDetection": "force",
"noEmit": true,
"strict": true
},
"include": ["src"]
}


@@ -0,0 +1,6 @@
import tailwindcss from "@tailwindcss/vite";
import { defineConfig } from "vite";
export default defineConfig({
plugins: [tailwindcss()],
});

pnpm-lock.yaml generated

File diff suppressed because it is too large.

pnpm-workspace.yaml

@@ -1,5 +1,6 @@
 packages:
   - codex-cli
+  - codex-webui
   - codex-rs/responses-api-proxy/npm
   - sdk/typescript
@@ -7,11 +8,15 @@ ignoredBuiltDependencies:
   - esbuild
 minimumReleaseAge: 10080
-minimumReleaseAgeExclude: []
+minimumReleaseAgeExclude:
+  - '@wterm/core@0.3.0'
+  - '@wterm/dom@0.3.0'
+  - '@wterm/ghostty@0.3.0'
 blockExoticSubdeps: true
 strictDepBuilds: true
 trustPolicy: no-downgrade
 trustPolicyIgnoreAfter: 10080
 trustPolicyExclude: []
-allowBuilds: {}
+allowBuilds:
+  node-pty: true