How to Extract Weather Data from Google in Python?
Last Updated: 05 Sep, 2022
In this article, we will see how to extract weather data from Google. Google does not expose its own public weather API; it fetches data from weather.com and displays it in search results. So we will scrape that data from Google's results page, and we will also look at a second method that fetches a schematic, text-based depiction of a location's weather for the next two days in Python without using an API.
Method 1:
Module needed:
Requests: Requests allows you to send HTTP/1.1 requests easily. This module does not come built-in with Python. To install it, type the below command in the terminal.
pip install requests
bs4: Beautiful Soup is a library that makes it easy to scrape information from web pages, whether HTML or XML, which can then be iterated over, searched, and modified. This module also does not come built-in with Python. To install it, type the below command in the terminal.
pip install bs4
Approach:
- Import the modules.
- Build the search URL from the city name:
"https://www.google.com/search?q=" + "weather" + cityname
- Create a requests instance and pass the URL to it.
- Get the raw HTML data.
- Extract the required data from the soup.
- Finally, print the required data.
Step-wise implementation of the code:
Step 1: Import the requests and bs4 libraries
Python3
import requests
from bs4 import BeautifulSoup
Step 2: Create a URL with the entered city name in it and pass it to the get function.
Python3
city = "lucknow"
url = "https://www.google.com/search?q=" + "weather" + city
html = requests.get(url).content
soup = BeautifulSoup(html, 'html.parser')
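City names containing spaces (for example "new delhi") must be URL-encoded before they can be appended to the query string. A small helper using the standard library's quote_plus can handle this; the function name weather_url is an illustrative addition, not part of the original article:

```python
from urllib.parse import quote_plus

def weather_url(city: str) -> str:
    # quote_plus encodes spaces as '+', so multi-word city names
    # produce a valid Google search query string
    return "https://www.google.com/search?q=weather+" + quote_plus(city)

print(weather_url("new delhi"))  # https://www.google.com/search?q=weather+new+delhi
```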
Step 3: soup returns a heap of data with HTML tags. A chunk of that data is shown below; from it we will get the necessary values with the help of the find function, passing the tag name and class name.
<div class="kvKEAb"><div><div><div class="BNeawe iBp4i AP7Wnd"><div><div class="BNeawe
iBp4i AP7Wnd">13°C</div></div></div></div></div><div><div><div class="BNeawe tAd8D AP7Wnd">
<div><div class="BNeawe tAd8D AP7Wnd">Saturday 11:10 am
Python3
temp = soup.find('div', attrs={'class': 'BNeawe iBp4i AP7Wnd'}).text
# use a descriptive name instead of shadowing the built-in str
desc = soup.find('div', attrs={'class': 'BNeawe tAd8D AP7Wnd'}).text
data = desc.split('\n')
time = data[0]
sky = data[1]
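The split step above can be checked offline with a sample string shaped like the scraped "time\nsky" text (the values here are made up for illustration, not live data):

```python
# Sample text in the "time\nsky" shape the scrape produces
desc = "Saturday 11:10 am\nSunny"
data = desc.split('\n')
time = data[0]
sky = data[1]
print(time)  # Saturday 11:10 am
print(sky)   # Sunny
```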
Step 4: Here listdiv contains all the div tags with a particular class name, and index 5 of this list has all the other required data.
Python3
listdiv = soup.findAll('div', attrs={'class': 'BNeawe s3v9rd AP7Wnd'})
strd = listdiv[5].text
pos = strd.find('Wind')
other_data = strd[pos:]
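The find-and-slice logic can also be tried offline. The sample string below is hypothetical text in the shape of listdiv[5].text, used only to demonstrate the slicing, not real scraped data:

```python
# Hypothetical scraped text; str.find returns the index of the first
# occurrence of 'Wind' (or -1 if it is absent)
strd = "lucknow, uttar pradesh Wind: 11 km/h Humidity: 58%"
pos = strd.find('Wind')
other_data = strd[pos:]
print(other_data)  # Wind: 11 km/h Humidity: 58%
```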
Step 5: Printing all the data
Python3
print("Temperature is", temp)
print("Time:", time)
print("Sky Description:", sky)
print(other_data)
Output:
Below is the full implementation:
Python3
import requests
from bs4 import BeautifulSoup

city = "lucknow"
url = "https://www.google.com/search?q=" + "weather" + city
html = requests.get(url).content
soup = BeautifulSoup(html, 'html.parser')
temp = soup.find('div', attrs={'class': 'BNeawe iBp4i AP7Wnd'}).text
desc = soup.find('div', attrs={'class': 'BNeawe tAd8D AP7Wnd'}).text
data = desc.split('\n')
time = data[0]
sky = data[1]
listdiv = soup.findAll('div', attrs={'class': 'BNeawe s3v9rd AP7Wnd'})
strd = listdiv[5].text
pos = strd.find('Wind')
other_data = strd[pos:]
print("Temperature is", temp)
print("Time:", time)
print("Sky Description:", sky)
print(other_data)
Output:
Method 2:
Module needed:
Requests: Requests allows you to send HTTP/1.1 requests easily. The HTTP request returns a response object containing all of the response data. This module does not come built-in with Python. To install it, type the below command in the terminal.
pip install requests
Approach:
- Import the requests module.
- Send a request to get the IP location information.
- Extract the location from the JSON response.
- Print the extracted location.
- Pass the city name and retrieve the weather data of the city.
- Print the output.
Below is the implementation:
Python3
import requests

# Get the current location from the IP address; ipinfo.io returns JSON
res = requests.get('https://ipinfo.io/json')
data = res.json()
citydata = data['city']
print(citydata)

# wttr.in returns a schematic, text-based weather report for the city
url = 'https://wttr.in/' + citydata
res = requests.get(url)
print(res.text)
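The JSON handling above can be tried offline with a canned response. The field names follow ipinfo.io's documented payload shape; the values below are made up for illustration:

```python
import json

# A sample ipinfo.io-style payload; 'city' is the field the code reads.
# requests' res.json() performs the same parse on the live response.
sample = '{"ip": "8.8.8.8", "city": "Mountain View", "region": "California"}'
data = json.loads(sample)
citydata = data['city']
print(citydata)  # Mountain View
```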
Output:
Displays the weather of the current location