Tokenization
noun
Definition
- 1. Replacing sensitive data with non-sensitive tokens while retaining its usability.
Example
Payment systems may use tokenization to reduce card-data exposure.
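The definition above can be sketched in code. The snippet below is a minimal, hypothetical illustration (not a production scheme): it swaps a card number for a random token, keeps the real value in a private "vault" mapping, and preserves the last four digits so the token stays usable for display.

```python
import secrets

class TokenVault:
    """Minimal tokenization sketch: replace a sensitive value with a
    random token and keep the mapping in a private store (the 'vault')."""

    def __init__(self):
        self._vault = {}  # token -> original sensitive value

    def tokenize(self, pan: str) -> str:
        # Random token; keep the last four digits for usability,
        # as many payment systems do for receipts and UIs.
        token = "tok_" + secrets.token_hex(8) + "_" + pan[-4:]
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can recover the original value.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
original = vault.detokenize(token)
```

Downstream systems handle only `token`; the full card number never leaves the vault, which is what reduces card-data exposure.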
Related Exams