examples/integrations/openai-codex/README.md
This example shows how to use OpenAI Codex with TensorZero — fully open-source and self-hosted.
## Why?

TensorZero lets you run Codex against any model supported by TensorZero through a single self-hosted, open-source gateway, and opens the door to the advanced inference and optimization workflows described below.

## Setup

1. Create a `.env` file with your provider credentials, e.g. `ANTHROPIC_API_KEY`. (See `.env.example` for reference.)
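   For example, a minimal `.env` for Anthropic might look like this (a sketch; check `.env.example` for the exact variables your provider needs):

   ```bash
   # .env (sketch): credentials consumed by docker compose
   ANTHROPIC_API_KEY=sk-ant-...  # replace with your real key
   ```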
2. Run `docker compose up` to start TensorZero.
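   Optionally, you can sanity-check the gateway before configuring Codex by calling its OpenAI-compatible endpoint directly (a minimal sketch; the URL and model name match the configuration below):

   ```bash
   # Send a test chat completion through the TensorZero Gateway
   curl http://localhost:3000/openai/v1/chat/completions \
     -H "Content-Type: application/json" \
     -d '{
       "model": "tensorzero::model_name::anthropic::claude-sonnet-4-5-20250929",
       "messages": [{"role": "user", "content": "Hello!"}]
     }'
   ```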
3. Install Codex: `npm i -g @openai/codex`
4. Add the TensorZero Gateway to your Codex configuration (`~/.codex/config.yaml` or `~/.codex/config.json`):
   ```yaml
   model: "tensorzero::model_name::anthropic::claude-sonnet-4-5-20250929"
   provider: tensorzero
   providers:
     tensorzero:
       name: TensorZero
       baseURL: http://localhost:3000/openai/v1
       envKey: TENSORZERO_API_KEY # not used but required by Codex
     # ... other providers ...
   ```
   Or, if you use `config.json`:

   ```json
   {
     "model": "tensorzero::model_name::anthropic::claude-sonnet-4-5-20250929",
     "provider": "tensorzero",
     "providers": {
       "tensorzero": {
         "name": "TensorZero",
         "baseURL": "http://localhost:3000/openai/v1",
         "envKey": "TENSORZERO_API_KEY"
       }
     }
   }
   ```
5. Run Codex with TensorZero:

   ```bash
   TENSORZERO_API_KEY="not-used" codex
   # or set the environment variable in your shell and just run `codex`
   ```
You can replace `tensorzero::model_name::anthropic::claude-sonnet-4-5-20250929` with any other model supported by TensorZero, e.g. `tensorzero::model_name::mistral::open-mistral-nemo-2407`.
You can also define custom TensorZero functions in the `config/tensorzero.toml` file and use them with Codex as `tensorzero::function_name::your_function_name` (see the sketch below). This lets you use advanced inference features, collect data for fine-tuning and other optimization recipes, and more.
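For illustration, a minimal `config/tensorzero.toml` might define a function like the following (a sketch; the function and variant names are hypothetical, and the shorthand model name is assumed to work in the variant's `model` field):

```toml
# Hypothetical function (sketch): names are illustrative, not part of this example.
[functions.draft_commit_message]
type = "chat"

# A variant binds the function to a concrete model.
[functions.draft_commit_message.variants.claude]
type = "chat_completion"
model = "anthropic::claude-sonnet-4-5-20250929"
```

With that in place, you would set `model` in the Codex configuration above to `tensorzero::function_name::draft_commit_message`.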
See our Quick Start Guide for more details.