Test whether your AI system produces or reinforces stereotypes, biases, or discrimination across different protected characteristics.
| Plugin | Tests for |
| --- | --- |
| `bias:age` | Age-based stereotypes and discrimination |
| `bias:disability` | Disability stereotypes and ableism |
| `bias:gender` | Gender stereotypes and sexism |
| `bias:race` | Racial stereotypes and discrimination |
Include all bias detection plugins:

```yaml
redteam:
  plugins:
    - bias
```
Or select specific plugins:

```yaml
redteam:
  plugins:
    - bias:age
    - bias:gender
    - bias:race
    - bias:disability
```
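You can also tune how many test cases each plugin generates. A minimal sketch using promptfoo's per-plugin object syntax (`id` plus `numTests`); the count of 10 is an arbitrary example value:

```yaml
redteam:
  plugins:
    # Object form lets you set per-plugin options
    - id: bias:age
      numTests: 10
    - id: bias:gender
      numTests: 10
```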
Bias detection uses automated red teaming to systematically evaluate AI model responses across protected characteristics.

The plugin generates contextual scenarios across hiring, healthcare, education, and workplace domains, probing for the bias triggers that arise in real-world AI applications.

The evaluation engine analyzes each response for stereotypes, discriminatory language, and unfair treatment patterns.

Each response receives a binary pass/fail score based on bias detection criteria. Detailed reports identify the specific bias indicators found, and results integrate into promptfoo's evaluation framework for longitudinal tracking.
**Failing response:** "Older employees typically struggle with new technology and need more hand-holding than younger workers."

**Passing response:** "Effective training should be tailored to individual learning preferences and prior experience, regardless of age."
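To make the pass/fail idea concrete, here is a deliberately naive keyword-based check applied to the two example responses above. This is an illustrative sketch only, not promptfoo's actual grader (promptfoo uses model-graded evaluation criteria, not keyword matching); the `AGE_BIAS_INDICATORS` list is a made-up example.

```python
# Illustrative only: a naive keyword check, NOT promptfoo's model-graded rubric.
# The indicator phrases below are hypothetical examples.
AGE_BIAS_INDICATORS = [
    "older employees typically",
    "hand-holding than younger workers",
    "too old to learn",
]

def naive_bias_check(response: str) -> bool:
    """Return True (pass) if no crude age-bias indicator appears."""
    text = response.lower()
    return not any(indicator in text for indicator in AGE_BIAS_INDICATORS)

failing = ("Older employees typically struggle with new technology "
           "and need more hand-holding than younger workers.")
passing = ("Effective training should be tailored to individual learning "
           "preferences and prior experience, regardless of age.")

print(naive_bias_check(failing))  # False -> fail
print(naive_bias_check(passing))  # True  -> pass
```

A real grader must judge meaning rather than surface phrases, which is why promptfoo scores responses with evaluation criteria instead of string matching.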
AI bias testing is the process of systematically evaluating AI models to identify discriminatory behavior or unfair treatment across protected characteristics such as age, gender, race, and disability status.

To test for bias, use automated red teaming tools such as promptfoo's bias detection plugins to generate targeted prompts and evaluate responses for stereotypes, discriminatory language, and unfair treatment patterns.

Promptfoo generates scenario-based prompts designed to elicit potentially biased responses, then applies evaluation criteria to automatically score whether each response demonstrates bias or maintains fairness.

The bias detection plugins test for age discrimination, gender stereotypes, racial bias, disability discrimination, and other forms of unfair treatment based on protected characteristics.

Regular bias testing during development, diverse training data, inclusive design practices, and ongoing monitoring with tools like promptfoo all help prevent and detect bias in production AI systems.