# azure/deepseek
This example demonstrates how to use DeepSeek models on Azure AI Foundry with promptfoo, including the DeepSeek-R1 reasoning model.
You can run this example with:

```sh
npx promptfoo@latest init --example azure/deepseek
cd azure/deepseek
```

Set your Azure credentials before running:

```sh
export AZURE_API_KEY=your-api-key
export AZURE_API_HOST=your-deployment.services.ai.azure.com
```
| Model | Type | Description |
| --- | --- | --- |
| DeepSeek-R1 | Reasoning | Advanced reasoning model |
| DeepSeek-V3 | Chat | Standard chat model |
| DeepSeek-R1-Distill-Llama-70B | Reasoning | Distilled reasoning model |
| DeepSeek-R1-Distill-Qwen-32B | Reasoning | Distilled reasoning model |
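Any of these models can be referenced as a promptfoo provider by its deployment name. A minimal sketch, assuming your Azure deployments are named after the models above (adjust the IDs to match your actual deployment names):

```yaml
# Hypothetical provider list; deployment names are assumptions.
providers:
  - id: azure:chat:DeepSeek-V3
  - id: azure:chat:DeepSeek-R1-Distill-Llama-70B
    config:
      isReasoningModel: true # distilled R1 variants are reasoning models too
```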
DeepSeek-R1 is a reasoning model that requires special configuration:

```yaml
providers:
  - id: azure:chat:DeepSeek-R1
    config:
      isReasoningModel: true # Required for reasoning models
      max_completion_tokens: 4096 # Use instead of max_tokens
      reasoning_effort: medium # low, medium, or high
```
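Putting the pieces together, a complete `promptfooconfig.yaml` might look like the sketch below. The prompt, variable name, and test case are illustrative placeholders, not taken from the example itself:

```yaml
# Hypothetical promptfooconfig.yaml sketch; prompt and test are placeholders.
description: DeepSeek-R1 on Azure AI Foundry
prompts:
  - 'Solve step by step: {{problem}}'
providers:
  - id: azure:chat:DeepSeek-R1
    config:
      isReasoningModel: true
      max_completion_tokens: 4096
      reasoning_effort: medium
tests:
  - vars:
      problem: 'What is 17 * 24?'
    assert:
      - type: contains
        value: '408'
```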
Run the evaluation, then open the web viewer to inspect results:

```sh
npx promptfoo@latest eval
npx promptfoo@latest view
```