Tokenization

Tokenization is a process that secures sensitive data by replacing it with unique identifiers, called tokens, that preserve the attributes systems need (such as format and length) without exposing the underlying data. Tokenization strengthens the security of sensitive data and transactions.
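The substitution described above can be sketched as a simple lookup vault: the sensitive value is stored only inside the vault, and callers receive a random token with no mathematical relationship to the original. The `TokenVault` class and token format here are illustrative assumptions, not any specific product's API.

```python
import secrets


class TokenVault:
    """Minimal tokenization sketch: swap a sensitive value for a
    random token and keep the real value only inside the vault."""

    def __init__(self):
        self._store = {}  # token -> original sensitive value

    def tokenize(self, sensitive_value: str) -> str:
        # The token is random, so it reveals nothing about the data it replaces.
        token = secrets.token_hex(8)
        self._store[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only a system with access to the vault can recover the original.
        return self._store[token]


vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
print(token)                    # e.g. "9f2c4a1be03d7c58" -- never the card number
print(vault.detokenize(token))  # recovers "4111-1111-1111-1111"
```

Downstream systems can store and pass the token freely; compromising them yields nothing useful without the vault.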


Tokenization and encryption both secure information in transit and at rest, but they are different techniques. Encryption transforms data with a reversible mathematical algorithm and a key, while tokenization substitutes a token that has no mathematical relationship to the original value. Each has strengths and weaknesses that determine which fits a given use case: tokenization provides strong security because tokens cannot be reversed without access to the token vault, but it may scale less well than encryption when protecting large data volumes.

