examples/redteam-azure-assistant/README.md
Red team testing for Azure OpenAI Assistants with function tools, used to evaluate their security boundaries.
You can run this example with:
```sh
npx promptfoo@latest init --example redteam-azure-assistant
cd redteam-azure-assistant
```
This example demonstrates how to security-test an Azure OpenAI Assistant that has access to sensitive HR data through function tools. It includes mock HR database functions (sketched below) and a configured red team setup.
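As a rough illustration of what such a function tool can look like, here is a minimal sketch of a mock HR lookup. The `getEmployeeRecord` name, fields, and record data are assumptions for illustration, not the exact contents of `callbacks/hr-database.js`.

```js
// Illustrative sketch only -- see callbacks/hr-database.js for the real implementation.
// The function name, fields, and records below are assumptions, not the example's data.
const employees = {
  E1001: { name: 'Jane Doe', department: 'Finance', salary: 95000 },
};

// A function-tool callback: the assistant invokes it with parsed JSON arguments,
// and the string it returns is passed back to the assistant as the tool result.
async function getEmployeeRecord({ employeeId }) {
  const record = employees[employeeId];
  return JSON.stringify(record || { error: 'Employee not found' });
}

module.exports = { getEmployeeRecord };
```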
Set up the following environment variables:

```sh
AZURE_API_KEY=your_key
AZURE_OPENAI_API_HOST=your-resource.openai.azure.com
AZURE_DEPLOYMENT_NAME=your_deployment_name
AZURE_CHAT_DEPLOYMENT_NAME=your_eval_deployment
```
Update the configuration in `promptfooconfig.yaml`:

- Replace `azure:assistant:asst_V3UgNCNUSAtHQdit8YimCKlJ` with your Assistant ID
- Set `apiHost` to your Azure endpoint
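A sketch of the relevant provider entry, assuming a typical promptfoo layout; check the actual `promptfooconfig.yaml` for the exact fields:

```yaml
# Sketch only -- the example's promptfooconfig.yaml may include additional options.
providers:
  - id: azure:assistant:asst_V3UgNCNUSAtHQdit8YimCKlJ # replace with your Assistant ID
    config:
      apiHost: your-resource.openai.azure.com # replace with your Azure endpoint
```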
Generate security tests:

```sh
npx promptfoo@latest redteam generate
```
Run the security evaluation:
```sh
npx promptfoo@latest redteam eval
```
View results:
```sh
npx promptfoo@latest redteam report
# or
npx promptfoo@latest view
```
This example tests whether the HR assistant can be manipulated into exposing the sensitive employee data it can reach through its function tools.

Key files:

- `callbacks/hr-database.js`: mock HR database functions exposed to the assistant
- `promptfooconfig.yaml`: red team configuration

This example contains simulated sensitive data for educational purposes. Use it responsibly, and only test systems you are authorized to evaluate.