CertNova

Tokenization

noun

Definition

  1. Tokenization is the process of replacing sensitive data with unique identification symbols that retain all the essential information about the data without compromising its security.

Example

Tokenization is often used in payment processing to replace credit card numbers with tokens, so downstream systems never handle the real card data.
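The payment-processing use case can be sketched in a few lines. This is a minimal, hypothetical illustration: the `TokenVault` class and its in-memory mapping are assumptions for the example, not a real payment system, which would use a hardened, access-controlled data store and strict key management.

```python
import secrets


class TokenVault:
    """Minimal in-memory token vault (illustration only).

    Maps randomly generated tokens back to the original sensitive
    values; only the vault can reverse the mapping.
    """

    def __init__(self):
        self._store = {}

    def tokenize(self, card_number: str) -> str:
        # Replace the card number with a random token that carries
        # no mathematical relationship to the original value.
        token = "tok_" + secrets.token_hex(8)
        self._store[token] = card_number
        return token

    def detokenize(self, token: str) -> str:
        # Reversal requires access to the vault itself.
        return self._store[token]


vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token)                    # random token, e.g. "tok_..."
print(vault.detokenize(token))  # original card number
```

Note that, unlike encryption, the token here is not derived from the card number at all; an attacker who steals only the tokens learns nothing about the underlying data.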
