TAAFT
Free mode
100% free
Freemium
Free Trial
Deals

Red Teaming

[rɛd ˈtiːmɪŋ]
Ethics & Safety
Last updated: December 9, 2024

Definition

Systematic stress testing of AI systems to identify vulnerabilities

Detailed Explanation

Practice of deliberately challenging AI systems to discover potential weaknesses biases or harmful behaviors before deployment

Use Cases

Security vulnerability testing Bias detection exercises Safety limit testing

Related Terms