Lex is a computer program that generates lexical analyzers; it was written by Mike Lesk and Eric Schmidt. Lex reads an input stream describing the lexical analyzer and outputs source code implementing the lexer in the C programming language.
Tokens: A token is a group of characters forming a basic atomic unit of syntax; that is, a token is a class of lexemes that matches a pattern. Examples: keywords, identifiers, operators, separators.
Input: int p=0, d=1, c=2;
Output: total no. of tokens = 13
(The 13 tokens are: int, p, =, 0, ,, d, =, 1, ,, c, =, 2, ;)
Below is the implementation of the above explanation:
- DFA in LEX code which accepts even number of 0's and even number of 1's
- DFA in LEX code which accepts odd number of 0's and even number of 1's
- C program to detect tokens in a C program
- Lex Program to count number of words
- LEX program to count the number of vowels and consonants in a given string
- Lex program to count the number of lines, spaces and tabs
- Three address code in Compiler
- Lex Program to print the total characters, white spaces, tabs in the given input file
- DFA in LEX code which accepts strings ending with 11
- Issues in the design of a code generator
- Compiler Design | Code Optimization
- Code Optimization | Frequency Reduction
- LEX code to extract HTML tags from a file
- Compiler Design | Introduction of Object Code
- Compiler Design | Intermediate Code Generation