# provider-opencode-sdk (OpenCode SDK Provider)

The OpenCode SDK provider runs agentic evals through OpenCode, an open-source coding agent with support for hosted and local model providers.
You can run this example with:

```sh
npx promptfoo@latest init --example provider-opencode-sdk
cd provider-opencode-sdk
```
Install the OpenCode CLI and SDK:

```sh
curl -fsSL https://opencode.ai/install | bash
npm install @opencode-ai/sdk
```
Configure provider credentials in your shell or `.env`. For example:

```sh
export OPENAI_API_KEY=your_api_key_here
```
If promptfoo starts the OpenCode server for you, you can also set `config.apiKey` together with `config.provider_id`. If you use `baseUrl`, the target OpenCode server must already be authenticated.
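The two connection modes described above might look like this (a sketch; the `baseUrl` value is a placeholder for wherever your OpenCode server is listening):

```yaml
providers:
  # Mode 1: promptfoo starts the OpenCode server and passes credentials directly
  - id: opencode:sdk
    config:
      provider_id: openai
      apiKey: '{{env.OPENAI_API_KEY}}'

  # Mode 2: connect to an already-running, already-authenticated OpenCode server
  - id: opencode:sdk
    config:
      baseUrl: http://localhost:4096 # placeholder URL
```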
If you are validating changes inside this repository, use `npm run local -- eval ...` from the repo root. If you initialized an example with `npx promptfoo@latest init --example provider-opencode-sdk`, run `npx promptfoo@latest eval -c promptfooconfig.yaml --no-cache` from the example directory.
The shipped examples use OpenAI for reproducible QA, but the provider itself is not OpenAI-specific. OpenCode can route to Anthropic, Google, Ollama, LM Studio, and other providers configured in OpenCode.
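For instance, routing to a local Ollama model might look like this (illustrative only; the exact `provider_id` and model name are assumptions and must match the providers configured in your OpenCode installation):

```yaml
providers:
  - id: opencode:sdk
    config:
      provider_id: ollama # assumed id; depends on your OpenCode provider config
      model: llama3.1
```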
### basic

Chat-only usage in a temporary directory with no filesystem tools.

Location: `./basic/`

Usage:

```sh
cd basic
npx promptfoo@latest eval -c promptfooconfig.yaml --no-cache
```
### working-dir

Read-only filesystem access with `working_dir`, using the default `read`, `grep`, `glob`, and `list` tools.

Location: `./working-dir/`

Usage:

```sh
cd working-dir
npx promptfoo@latest eval -c promptfooconfig.yaml --no-cache
```
### structured-output

Provider-enforced JSON Schema output using the OpenCode `format` request option.

Location: `./structured-output/`

Usage:

```sh
cd structured-output
npx promptfoo@latest eval -c promptfooconfig.yaml --no-cache
```
### permissions

Session-level permission rules mixing simple and pattern-based entries. Promptfoo converts the object config into the `PermissionRuleset` array the OpenCode v2 API expects.

Location: `./permissions/`

Usage:

```sh
cd permissions
npx promptfoo@latest eval -c promptfooconfig.yaml --no-cache
```
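The object-to-array conversion mentioned above can be sketched as follows. This is an illustration of the general idea, not promptfoo's actual implementation; the field names `permission`, `pattern`, and `action` are assumptions, not the real OpenCode types.

```typescript
// Hypothetical shapes, for illustration only.
type Action = 'allow' | 'deny' | 'ask';

interface PermissionRule {
  permission: string; // tool name, e.g. 'bash' or 'edit'
  pattern?: string;   // present for pattern-based entries like 'edit:*.md'
  action: Action;
}

// Convert an object-style config into an array of rule entries.
// Simple entries use a bare tool name as the key; pattern-based
// entries use a 'tool:pattern' key.
function toRuleset(config: Record<string, Action>): PermissionRule[] {
  return Object.entries(config).map(([key, action]) => {
    const [permission, pattern] = key.split(':', 2);
    return pattern ? { permission, pattern, action } : { permission, action };
  });
}

const rules = toRuleset({ bash: 'ask', 'edit:*.md': 'allow' });
// rules[0] → { permission: 'bash', action: 'ask' }
// rules[1] → { permission: 'edit', pattern: '*.md', action: 'allow' }
```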
The provider supports these high-value options:

```yaml
providers:
  - id: opencode:sdk
    config:
      provider_id: openai
      model: gpt-4o-mini
      # Optional: pass credentials directly when promptfoo starts the server
      apiKey: '{{env.OPENAI_API_KEY}}'
      # Read-only local repo access
      working_dir: ./examples/provider-opencode-sdk/basic
      workspace: feature-branch
      # Structured output
      format:
        type: json_schema
        schema:
          type: object
          properties:
            answer:
              type: string
          required: [answer]
      # Reuse the same OpenCode session for repeated calls
      persist_sessions: true
```
OpenCode can route requests to the provider ecosystem it already supports, including: