Tokenization replaces sensitive data with randomly generated surrogate values (tokens) that reveal nothing about the original data. Unlike encryption, there is no mathematical relationship between the token and the original value — the mapping exists only in a secured vault database. Common use cases include credit card numbers and personally identifiable information. In an ISMS, tokenization is an effective data minimization measure because it significantly reduces the volume of sensitive data held in downstream systems.
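The vault-based mapping described above can be sketched as follows. This is a minimal illustration, not a production design: the in-memory dictionaries stand in for the secured vault database, and the names `TokenVault`, `tokenize`, and `detokenize` are hypothetical, not any real library's API.

```python
import secrets


class TokenVault:
    """Illustrative vault: maps random tokens to original values and back."""

    def __init__(self):
        # In a real deployment this mapping lives in a hardened, access-
        # controlled vault database, not in process memory.
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Return the existing token so the same input always yields one token.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # The token is purely random, so it reveals nothing about the value.
        token = secrets.token_hex(16)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only systems with vault access can recover the original value.
        return self._token_to_value[token]


vault = TokenVault()
card = "4111 1111 1111 1111"  # well-known test card number
token = vault.tokenize(card)
assert token != card                      # surrogate, not the real value
assert vault.tokenize(card) == token      # deterministic per value
assert vault.detokenize(token) == card    # reversible only via the vault
```

Downstream systems store and pass around only `token`; the original card number never leaves the vault's trust boundary, which is what makes tokenization a data minimization measure.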