Prompt Injection Attacks
noun
Definition
- 1.Prompt injection attacks involve maliciously altering inputs to manipulate AI model outputs, potentially leading to incorrect or harmful results.
Example
An attacker might inject misleading information into a prompt to skew the output of a language model.
Related Exams

