CertNova

Adversarial Risk Mitigation

noun

Definition

  1. Adversarial risk mitigation refers to strategies and techniques used to protect AI models from attacks that manipulate input data to produce incorrect outputs.

Example

Adversarial training, a robust training technique in which a model is retrained on deliberately perturbed inputs, can reduce the risk that small input manipulations flip the model's predictions.
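As a minimal sketch of what such an attack looks like (all weights and values below are illustrative, not from this glossary), the Fast Gradient Sign Method (FGSM) nudges an input in the direction that most increases the loss of a toy logistic classifier; adversarial training would then add inputs perturbed this way back into the training set:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm_perturb(x, y, w, b, eps):
    """FGSM: shift the input along the sign of the loss gradient,
    bounded by eps, to push the model toward a wrong prediction."""
    p = sigmoid(np.dot(w, x) + b)
    grad_x = (p - y) * w            # d(logistic loss)/dx
    return x + eps * np.sign(grad_x)

# Toy linear classifier; parameters are made up for illustration.
w = np.array([1.0, -2.0])
b = 0.0
x = np.array([0.3, 0.1])            # clean input, true label 1
y = 1.0

clean_pred = sigmoid(np.dot(w, x) + b)          # about 0.52 -> class 1
x_adv = fgsm_perturb(x, y, w, b, eps=0.5)
adv_pred = sigmoid(np.dot(w, x_adv) + b)        # about 0.20 -> class 0
```

A small sign-bounded perturbation flips the predicted class even though the input barely changes; adversarial training mitigates this by including examples like `x_adv` (with the correct label) during training.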

Related Exams