RhymeZone

 

Definitions of tokenizer:
  • noun:   (computing) A system that parses an input stream into its component tokens.

(Definitions from Wiktionary)
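The definition above can be illustrated with a minimal sketch in Python: a tokenizer that scans an input stream (here, a string) and returns its component tokens. The token categories (numbers, words, punctuation) and the regular expression are illustrative assumptions, not part of the definition itself.

```python
import re

# Illustrative token pattern: integers, word-like identifiers,
# or single punctuation characters (whitespace is skipped).
TOKEN_PATTERN = re.compile(r"\d+|\w+|[^\w\s]")

def tokenize(stream: str) -> list[str]:
    """Parse the input stream into its component tokens."""
    return TOKEN_PATTERN.findall(stream)

print(tokenize("x = 42;"))  # → ['x', '=', '42', ';']
```

A real tokenizer (for a programming language or an NLP pipeline) would typically tag each token with a category and a source position, but the core idea is the same: a stream goes in, a sequence of tokens comes out.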



 