Count the number of paragraph tag using BeautifulSoup
Last Updated :
07 Apr, 2021
Sometimes, while extracting data from an HTML webpage, you may want to know how many paragraph tags a given HTML document contains. This article shows how to count them with BeautifulSoup.
Syntax:
print(len(soup.find_all("p")))
Approach:
Step 1: First, import the required libraries: BeautifulSoup and os.
from bs4 import BeautifulSoup as bs
import os
Step 2: Next, get the directory of the Python file you are currently working in by stripping the last segment of its absolute path.
base = os.path.dirname(os.path.abspath('#Name of Python file in which you are currently working'))
Step 3: Then, open the HTML file from which you want to read the value.
html = open(os.path.join(base, '#Name of HTML file from which you wish to read value'))
Step 4: Then, parse the HTML file with BeautifulSoup.
soup=bs(html, 'html.parser')
Step 5: Optionally, print a descriptive label before the result.
print("Number of paragraph tags:")
Step 6: Finally, calculate and print the number of paragraph tags in the HTML document.
print(len(soup.find_all("p")))
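Put together, the steps above can be sketched as a single self-contained script. Here an inline HTML string (a made-up example) stands in for the file on disk, so no `os` path handling is needed.

```python
from bs4 import BeautifulSoup as bs

# A small inline document stands in for the HTML file from the steps above
html = """
<html>
  <body>
    <p>one</p>
    <div><p>two</p></div>
    <p>three</p>
  </body>
</html>
"""

soup = bs(html, 'html.parser')

print("Number of paragraph tags:")
print(len(soup.find_all("p")))  # find_all returns every <p>, including nested ones
```

Running this prints 3, since `find_all` walks the whole tree and counts the `<p>` nested inside the `<div>` as well.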
Implementation:
Example 1
Let us consider a simple HTML webpage containing several paragraph tags.
HTML
<!DOCTYPE html>
<html>
<head>
    Geeks For Geeks
</head>
<body>
    <div>
        <p>King</p>
        <p>Prince</p>
        <p>Queen</p>
    </div>
    <p id="vinayak">Princess</p>
</body>
</html>
For finding the number of paragraph tags in the above HTML webpage, implement the following code.
Python
from bs4 import BeautifulSoup as bs

# Open the local HTML file and parse it
html = open('gfg.html')
soup = bs(html, 'html.parser')

print("Number of paragraph tags:")
print(len(soup.find_all("p")))
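Note that `find_all("p")` counts every `<p>` in the document, including those nested inside other tags, which is why the page above yields 4. If installing BeautifulSoup is not an option, Python's built-in `html.parser` module can perform the same count; the class below is a minimal sketch of that alternative, not part of the original approach.

```python
from html.parser import HTMLParser

class ParagraphCounter(HTMLParser):
    """Counts opening <p> tags as the parser walks the document."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            self.count += 1

parser = ParagraphCounter()
parser.feed('<div><p>King</p><p>Prince</p><p>Queen</p></div>'
            '<p id="vinayak">Princess</p>')
print("Number of paragraph tags:")
print(parser.count)  # -> 4
```

This counts only opening tags, so malformed documents with unclosed `<p>` elements are still tallied the same way BeautifulSoup's lenient parser would tally them.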
Output:
Number of paragraph tags:
4
Example 2
In the below program, we will find the number of paragraph tags on a live website fetched with the requests library.
Python
from bs4 import BeautifulSoup as bs
import requests

# Replace with the URL of the website you want to inspect
URL = 'https://www.geeksforgeeks.org/'

# Download the page and parse its content
page = requests.get(URL)
soup = bs(page.content, 'html.parser')

print("Number of paragraph tags:")
print(len(soup.find_all("p")))
Output:
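When fetching a live page it is good practice to set a timeout and check the response status before parsing. The helper below is a hedged sketch of that pattern; the function names `count_paragraphs` and `fetch_and_count` are illustrative, not from the original article.

```python
from bs4 import BeautifulSoup as bs
import requests

def count_paragraphs(html_text):
    """Return the number of <p> tags in an HTML string or bytes."""
    soup = bs(html_text, 'html.parser')
    return len(soup.find_all("p"))

def fetch_and_count(url):
    """Download a page and count its paragraph tags, with basic error handling."""
    page = requests.get(url, timeout=10)
    page.raise_for_status()  # raise an exception on HTTP error codes
    return count_paragraphs(page.content)

# The counting logic can be exercised without a network call:
print(count_paragraphs("<p>a</p><div><p>b</p></div>"))  # -> 2
```

Splitting the download from the counting keeps the parsing logic testable offline and lets the same counter be reused for local files.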