Lexical analysis is the process of taking an input string of characters (such as the source code of a computer program) and producing a sequence of symbols called "lexical tokens", or just "tokens", which may be handled more easily by a parser.
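To make the character-to-token step concrete, here is a minimal lexer sketch in Python. The token names and regular-expression patterns are illustrative assumptions, not taken from any particular tool:

```python
import re

# Token specification: each pair maps a token name to a regex pattern.
# These names and patterns are hypothetical, chosen for illustration.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("PLUS",   r"\+"),
    ("TIMES",  r"\*"),
    ("SKIP",   r"\s+"),   # whitespace is matched but discarded
]
MASTER = re.compile("|".join(f"(?P<{n}>{p})" for n, p in TOKEN_SPEC))

def tokenize(text):
    """Turn an input string of characters into a list of (kind, value) tokens."""
    tokens = []
    for match in MASTER.finditer(text):
        kind = match.lastgroup
        if kind != "SKIP":
            tokens.append((kind, match.group()))
    return tokens

print(tokenize("x + 42 * y"))
# [('IDENT', 'x'), ('PLUS', '+'), ('NUMBER', '42'), ('TIMES', '*'), ('IDENT', 'y')]
```

The resulting token sequence, rather than the raw character stream, is what a parser consumes.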
A token, in computing, is a segment of text, regardless of whether it consists of readable words or of symbols. Tokens are generally defined abstractly in a context-free grammar, which is fed into a program such as yacc that checks the stream of tokens for conformity to this grammar.
A parser is a computer program or a component of a program that analyses the grammatical structure of an input, with respect to a given formal grammar, a process known as parsing. In linguistics, grammar is the set of structural rules that govern the composition of sentences, phrases, and words in any given natural language.
Related Notes:
Lexical analysis is concerned with scanning the input for significant clusters of characters called tokens. The input stream of characters is analyzed using delimiters and a careful description of the way various types of tokens may be constructed.
D. Soda, G. W. Zobrist
CSC '89: Proceedings of the 17th conference on ACM Annual Computer Science Conference
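The delimiter-driven scan described in the note above can be sketched as follows; the delimiter set chosen here is an assumption for illustration:

```python
def scan(text, delimiters=" \t\n"):
    """Split the input character stream into token clusters,
    using the given delimiter characters as boundaries."""
    tokens, current = [], ""
    for ch in text:
        if ch in delimiters:
            if current:               # close off the cluster in progress
                tokens.append(current)
                current = ""
        else:
            current += ch             # extend the current cluster
    if current:                       # flush a trailing cluster
        tokens.append(current)
    return tokens

print(scan("let x = 5"))  # ['let', 'x', '=', '5']
```

Practical scanners combine this delimiter handling with per-token-type construction rules, as in the regex-based approach.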