Lex is a computer program that generates lexical analyzers; it was written by Mike Lesk and Eric Schmidt. Lex reads an input specification describing the lexical analyzer and outputs C source code implementing that analyzer.
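As a minimal sketch of the input format Lex expects, a specification has three sections (definitions, rules, and user code) separated by `%%`; running lex (or flex) on it produces a C file, lex.yy.c, that can be compiled into the analyzer. The specific rules below are illustrative, not a complete tokenizer:

```
%{
/* C declarations copied verbatim into the generated analyzer */
#include <stdio.h>
%}
%%
[0-9]+                  { printf("NUMBER: %s\n", yytext); }
[a-zA-Z_][a-zA-Z0-9_]*  { printf("IDENTIFIER/KEYWORD: %s\n", yytext); }
[ \t\n]+                ; /* skip whitespace */
.                       { printf("OPERATOR/SEPARATOR: %s\n", yytext); }
%%
int main(void) { yylex(); return 0; }
int yywrap(void) { return 1; }
```

Saved as scanner.l, this would typically be built with `lex scanner.l && cc lex.yy.c -o scanner`.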
Tokens: A token is a group of characters that forms a basic, atomic unit of syntax; that is, a token is a class of lexemes that match a pattern. Examples include keywords, identifiers, operators, and separators.
Input: int p=0, d=1, c=2;
Output: total number of tokens = 13
(The 13 tokens are: int, p, =, 0, ",", d, =, 1, ",", c, =, 2, and ";".)