Tokenization

Tokenization is the process of substituting a sensitive data element with a non-sensitive equivalent, known as a token, that has no extrinsic or exploitable meaning or value. The token can be mapped back to the original data only through the tokenization system that issued it, typically via a secured lookup sometimes called a token vault. Tokenization is used to safeguard sensitive information such as credit card numbers, reducing the risk of data breaches while preserving the data's utility for processing and analytics.
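As a minimal sketch of the idea, the Python example below replaces a credit card number with a random token and keeps the token-to-value mapping in an in-memory vault. The TokenVault class, its methods, and the sample card number are illustrative assumptions for this sketch, not part of any specific tokenization product; a real system would store the mapping in a hardened, access-controlled service.

```python
import secrets


class TokenVault:
    """Minimal in-memory token vault: maps random tokens to original values.

    Illustrative only; production tokenization keeps this mapping in a
    secured, audited data store rather than in process memory.
    """

    def __init__(self):
        self._token_to_value = {}   # token -> sensitive value
        self._value_to_token = {}   # sensitive value -> token (reuse on repeat calls)

    def tokenize(self, value: str) -> str:
        # Return the existing token if this value was tokenized before.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # A random token carries no exploitable information about the original value.
        token = secrets.token_urlsafe(16)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only systems with access to the vault can recover the original value.
        return self._token_to_value[token]


vault = TokenVault()
card_number = "4111 1111 1111 1111"  # sample card number, for illustration only

token = vault.tokenize(card_number)
print(token)                                   # safe to store or pass to downstream systems
print(vault.detokenize(token) == card_number)  # True: the mapping lives only in the vault
```

Downstream systems that only need to count, join, or deduplicate records can work with the token alone, which is why tokenization preserves analytic utility while keeping the sensitive value confined to the vault.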