# TrueFoundry AI Gateway Example

This directory contains example configurations for using TrueFoundry's AI Gateway (LLM, MCP, and Agent Gateway) with promptfoo.
## Prerequisites

1. **TrueFoundry API Key**: Obtain your API key from the TrueFoundry Console.
2. **Set the environment variable**:

   ```bash
   export TRUEFOUNDRY_API_KEY=your_api_key_here
   ```
## Quick Start

To quickly set up this example:

```bash
npx promptfoo@latest init --example provider-truefoundry
cd provider-truefoundry
```
### Basic Example (`promptfooconfig.yaml`)

A simple example demonstrating basic TrueFoundry usage with multiple models.

Run the example:

```bash
npx promptfoo eval -c promptfooconfig.yaml
```

View results:

```bash
npx promptfoo view
```
### MCP Example (`promptfooconfig-mcp.yaml`)

An advanced example showcasing TrueFoundry's MCP (Model Context Protocol) server integration.

Run the example:

```bash
npx promptfoo eval -c promptfooconfig-mcp.yaml
```
## Configuration

```yaml
providers:
  - id: truefoundry:openai-main/gpt-5
    config:
      temperature: 0.7
      max_completion_tokens: 500
```

Note: The model identifier format is `provider-account/model-name`. The `provider-account` is the name of your provider integration in TrueFoundry (e.g., `openai-main`, `anthropic-main`).
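As an illustration, because the provider account is part of the model ID, a single eval can compare models from different providers side by side. The account names below are hypothetical; substitute the names of your own TrueFoundry integrations:

```yaml
providers:
  # 'openai-main' and 'anthropic-main' are example integration names
  - id: truefoundry:openai-main/gpt-4o-mini
  - id: truefoundry:anthropic-main/claude-3-5-sonnet-20241022
    config:
      temperature: 0.2
```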
You can attach request metadata and enable gateway logging in the provider `config`:

```yaml
config:
  metadata:
    user_id: 'your-user-id'
    environment: 'development'
    custom_key: 'custom_value'
  loggingConfig:
    enabled: true
```
To enable MCP tools, configure `mcp_servers` in the provider `config`:

```yaml
config:
  mcp_servers:
    - integration_fqn: 'common-tools'
      enable_all_tools: false
      tools:
        - name: 'web_search'
        - name: 'code_executor'
  iteration_limit: 20
```
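Putting the pieces together, a minimal end-to-end config using an MCP-enabled provider might look like the sketch below. The prompt, variable, and test case are illustrative and not part of this example's shipped files:

```yaml
description: TrueFoundry MCP sketch (illustrative)
prompts:
  - 'Search the web for {{topic}} and summarize the top result.'
providers:
  - id: truefoundry:openai-main/gpt-4o
    config:
      mcp_servers:
        - integration_fqn: 'common-tools'
          enable_all_tools: false
          tools:
            - name: 'web_search'
      iteration_limit: 20
tests:
  - vars:
      topic: 'model context protocol'
```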
## Available Models

TrueFoundry provides access to models from multiple providers. Use the format `provider-account/model-name`:
- `truefoundry:openai-main/gpt-5`
- `truefoundry:openai-main/gpt-4o`
- `truefoundry:openai-main/gpt-4o-mini`
- `truefoundry:openai-main/o1`
- `truefoundry:openai-main/o1-mini`
- `truefoundry:anthropic-main/claude-sonnet-4.5`
- `truefoundry:anthropic-main/claude-3-5-sonnet-20241022`
- `truefoundry:anthropic-main/claude-3-opus-20240229`
- `truefoundry:vertex-ai-main/gemini-2.5-pro`
- `truefoundry:vertex-ai-main/gemini-2.5-flash`
- `truefoundry:vertex-ai-main/gemini-1.5-pro`
- `truefoundry:groq-main/llama-3.3-70b-versatile`
- `truefoundry:mistral-main/mistral-large-latest`
## Embedding Models

TrueFoundry also supports embedding models:

```yaml
providers:
  - id: truefoundry:openai-main/text-embedding-3-large
    config:
      metadata:
        user_id: 'embedding-user'
      loggingConfig:
        enabled: true
```
When using Cohere embedding models, specify the `input_type`:

```yaml
providers:
  - id: truefoundry:cohere-main/embed-english-v3.0
    config:
      input_type: 'search_query' # Options: search_query, search_document, classification, clustering
```
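One common use for an embedding provider is scoring `similar` assertions. As a sketch, promptfoo lets you override the embedding provider used for similarity grading via `defaultTest.options.provider.embedding`; the threshold and test case below are illustrative:

```yaml
defaultTest:
  options:
    provider:
      # Use the TrueFoundry embedding model for 'similar' assertions
      embedding:
        id: truefoundry:openai-main/text-embedding-3-large
tests:
  - vars:
      question: 'What is the capital of France?'
    assert:
      - type: similar
        value: 'Paris is the capital of France.'
        threshold: 0.8
```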
## Logging

When `loggingConfig.enabled` is set to `true`, all requests are logged in the TrueFoundry dashboard, where you can inspect individual requests and responses.
## Getting Help

- For TrueFoundry-specific questions, refer to the TrueFoundry documentation.
- For promptfoo-related questions, refer to the promptfoo documentation.