substitute a randomly generated identifier for (a sensitive piece of data) to prevent unauthorized access
sensitive data has been tokenized or strongly encrypted
tokenized payment systems
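This first sense can be sketched in code. A minimal illustration, assuming a hypothetical `TokenVault` class (not a real library): each sensitive value is swapped for a random identifier, and the mapping is held apart from the systems that handle the token.

```python
import secrets

class TokenVault:
    """Hypothetical sketch of data tokenization: sensitive values are
    replaced by random tokens; only the vault can map a token back."""

    def __init__(self):
        # token -> original value (in practice this mapping would be
        # stored in a hardened, access-controlled service)
        self._vault = {}

    def tokenize(self, sensitive: str) -> str:
        token = secrets.token_hex(8)  # randomly generated identifier
        self._vault[token] = sensitive
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]

vault = TokenVault()
tok = vault.tokenize("4111 1111 1111 1111")
# Downstream systems see only `tok`; the card number stays in the vault.
```

Unlike encryption, the token has no mathematical relationship to the original value, so it cannot be reversed without access to the vault.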
break (text) into individual linguistic units
our text gets tokenized into terms
check if words are tokenized and normalized correctly
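The linguistic sense can likewise be illustrated. A crude word tokenizer, assuming simple whitespace-and-punctuation splitting (real NLP tokenizers handle contractions, hyphens, and subwords far more carefully):

```python
import re

def tokenize(text: str) -> list[str]:
    # Lowercase (a simple normalization step), then split on runs of
    # non-word characters; drop the empty strings re.split can produce.
    return [t for t in re.split(r"\W+", text.lower()) if t]

tokens = tokenize("Our text gets tokenized into terms.")
# -> ['our', 'text', 'gets', 'tokenized', 'into', 'terms']
```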
treat (a member of a minority group) as though they were included merely by way of tokenism
there was immense diversity and no one was tokenized