this step is to convert a stream of characters/symbols into words, called tokens. These tokens
are consumed by the later phases of a compiler. Many tools have been developed in the past that
generate the tokenizer automatically, but these tools are best suited to sequential processing.
With the advent of multi-core processors, it is possible to parallelize tokenization by
exploiting the parallel constructs of programming languages. In this paper we propose a parallel …
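The idea of parallel tokenization described above can be sketched as follows: split the input at points where no token can straddle a chunk boundary, tokenize the chunks concurrently, and concatenate the results in order. This is only an illustrative sketch, not the paper's method; the token grammar (`TOKEN_RE`), the whitespace-based chunking, and the use of a thread pool are all assumptions made for the example.

```python
import re
from concurrent.futures import ThreadPoolExecutor

# Hypothetical token grammar: identifiers, integer literals, single symbols.
TOKEN_RE = re.compile(r"[A-Za-z_]\w*|\d+|\S")

def tokenize(chunk: str) -> list[str]:
    """Sequentially tokenize one chunk of the input."""
    return TOKEN_RE.findall(chunk)

def parallel_tokenize(text: str, workers: int = 4) -> list[str]:
    """Split the input at whitespace so no token straddles a chunk
    boundary, tokenize the chunks concurrently, and keep input order."""
    words = text.split()  # whitespace split keeps each token inside one chunk
    size = max(1, len(words) // workers)
    chunks = [" ".join(words[i:i + size]) for i in range(0, len(words), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(tokenize, chunks)  # map preserves chunk order
    return [tok for part in parts for tok in part]
```

A real implementation would use processes rather than threads for CPU-bound scanning, and would need a more careful boundary strategy when tokens (e.g. string literals or comments) may contain whitespace.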