Look up tokenization or tokenisation in Wiktionary, the free dictionary.
Tokenization may refer to:
- Tokenization (lexical analysis) in language processing
- Tokenization in search engine indexing
- Tokenization (data security), the substitution of sensitive data elements with non-sensitive equivalents (tokens)
- Word segmentation
See also
- Tokenism of minorities