Lex code to count total number of tokens

  • Difficulty Level : Basic
  • Last Updated : 21 May, 2019

Lex is a program that generates lexical analyzers; it was written by Mike Lesk and Eric Schmidt. Lex reads an input file specifying the lexical analyzer and outputs source code implementing that analyzer in the C programming language.

Tokens: A token is a group of characters forming a basic atomic unit of syntax; that is, a token is a class of lexemes that match a pattern. Examples: keywords, identifiers, operators, separators.


Input: int p=0, d=1, c=2;

Output: total no. of tokens = 13

Below is the implementation of the above explanation:

%{
/* Lex code to count the total number of tokens */
#include <stdio.h>
int n = 0;
%}

%%
 /* count keywords */
"while"|"if"|"else"            {n++; printf("\t keyword : %s\n", yytext);}
"int"|"float"                  {n++; printf("\t keyword : %s\n", yytext);}
 /* count identifiers */
[a-zA-Z_][a-zA-Z0-9_]*         {n++; printf("\t identifier : %s\n", yytext);}
 /* count operators */
"<="|"=="|"="|"++"|"-"|"*"|"+" {n++; printf("\t operator : %s\n", yytext);}
 /* count separators */
[(){},;]                       {n++; printf("\t separator : %s\n", yytext);}
 /* count floating-point literals */
[0-9]*"."[0-9]+                {n++; printf("\t float : %s\n", yytext);}
 /* count integer literals */
[0-9]+                         {n++; printf("\t integer : %s\n", yytext);}
.|\n                           ;  /* ignore whitespace and anything else */
%%

int yywrap(void) { return 1; }

int main(void)
{
    yylex();
    printf("\n total no. of tokens = %d\n", n);
    return 0;
}
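To try the program, save it to a file and run it through Lex (or flex) and a C compiler. The filename tokens.l below is illustrative; this assumes flex and cc are available on the system:

```shell
# generate lex.yy.c from the specification, then compile it
lex tokens.l          # or: flex tokens.l
cc lex.yy.c -o tokens

# pipe in the sample input (an interactive run ends with Ctrl+D / EOF)
echo "int p=0, d=1, c=2;" | ./tokens
```

For the sample line this should print each classified lexeme followed by the final count of 13 tokens.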

