GET and POST Requests Using Python
This post discusses two HTTP (Hypertext Transfer Protocol) request methods, GET and POST, and their implementation in Python.
What is HTTP?
HTTP is a protocol designed to enable communication between clients and servers. It works as a request-response protocol between a client and a server. A web browser may be the client, and an application on a computer that hosts a website may be the server. To request a response from the server, there are two main methods:
- GET: To request data from the server.
- POST: To submit data to be processed to the server.
Now, to make HTTP requests in Python, we can use several HTTP libraries, such as http.client (httplib in Python 2), urllib, and requests. The most elegant and simplest of these is requests, which is the library we will use in this article. To download and install the requests library, use the following command:
pip install requests
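To verify the installation, import the library and print its version (the exact version shown will depend on your environment):
import requests
print(requests.__version__)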
Making a GET request
The example below finds the latitude, longitude, and formatted address of a given location by sending a GET request to the Google Maps Geocoding API. (Note that this API now requires an API key, so the request as written may be rejected unless you supply one.) An API (Application Programming Interface) enables you to access the internal features of a program in a limited fashion. In most cases, the data provided is in JSON (JavaScript Object Notation) format, which parses into dictionary and list objects in Python.
Python
# importing the requests library
import requests
# api-endpoint
URL = "http://maps.googleapis.com/maps/api/geocode/json"
# location given here
location = "delhi technological university"
# defining a params dict for the parameters to be sent to the API
PARAMS = {'address':location}
# sending get request and saving the response as response object
r = requests.get(url = URL, params = PARAMS)
# extracting data in json format
data = r.json()
# extracting latitude, longitude and formatted address
# of the first matching location
latitude = data['results'][0]['geometry']['location']['lat']
longitude = data['results'][0]['geometry']['location']['lng']
formatted_address = data['results'][0]['formatted_address']
# printing the output
print("Latitude:%s\nLongitude:%s\nFormatted Address:%s"
%(latitude, longitude,formatted_address))
Output:
Latitude:<latitude of the first match>
Longitude:<longitude of the first match>
Formatted Address:<formatted address of the first match>
Important points to note:
PARAMS = {'address': location}
The URL for a GET request generally carries some parameters with it. With the requests library, parameters can be defined as a dictionary; they are then URL-encoded and appended to the base URL (the API endpoint). To see the role of the parameters, try printing r.url after the response object is created. You will see something like this:
http://maps.googleapis.com/maps/api/geocode/json?address=delhi+technological+university
This is the actual URL on which the GET request is made.
r = requests.get(url = URL, params = PARAMS)
Here we create a response object 'r' which stores the server's response. We use the requests.get() method since we are sending a GET request. The two arguments we pass are the URL and the parameters dictionary.
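In practice, it is also a good idea to pass a timeout so the call fails fast if the API is unreachable, rather than hanging indefinitely. A minimal variant of the same call (the 5-second value is an arbitrary choice, not part of the original example):
# same GET request, but give up if no response arrives within 5 seconds
r = requests.get(url = URL, params = PARAMS, timeout = 5)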
data = r.json()
Now, in order to retrieve the data from the response object, we need to parse the raw JSON response content into Python data structures (dictionaries and lists). This is achieved by using the json() method. Finally, we extract the required information by indexing into the resulting object.
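Note that indexing straight into data['results'][0] raises an exception if the API returns an error or no matches. A slightly more defensive sketch, assuming the same response structure as above:
# guard against a failed request or an empty result list before indexing
if r.status_code == 200 and data.get('results'):
    first_result = data['results'][0]
    print(first_result['formatted_address'])
else:
    # the geocoding response carries a 'status' field describing failures
    print("No results:", data.get('status'))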
Making a POST request
This example explains how to paste your source_code to pastebin.com by sending a POST request to the Pastebin API. First of all, you will need to generate an API key by signing up for a Pastebin account and then retrieving your unique developer key from Pastebin's API documentation page.
Python
# importing the requests library
import requests
# defining the api-endpoint
API_ENDPOINT = "http://pastebin.com/api/api_post.php"
# your API key here
API_KEY = "XXXXXXXXXXXXXXXXX"
# your source code here
source_code = '''
print("Hello, world!")
a = 1
b = 2
print(a + b)
'''
# data to be sent to api
data = {'api_dev_key': API_KEY,
        'api_option': 'paste',
        'api_paste_code': source_code,
        'api_paste_format': 'python'}
# sending post request and saving response as response object
r = requests.post(url=API_ENDPOINT, data=data)
# extracting response text
pastebin_url = r.text
print("The pastebin URL is:%s" % pastebin_url)
Important features of this code:
data = {'api_dev_key': API_KEY,
        'api_option': 'paste',
        'api_paste_code': source_code,
        'api_paste_format': 'python'}
Here again, we need to pass some data to the API server. We store this data as a dictionary.
r = requests.post(url = API_ENDPOINT, data = data)
Here we create a response object 'r' which stores the server's response. We use the requests.post() method since we are sending a POST request. The two arguments we pass are the URL and the data dictionary.
pastebin_url = r.text
In response, the server processes the data sent to it and returns the URL of your newly created paste, which can be accessed simply via r.text.
The requests.post() method can be used for many other tasks as well, such as filling and submitting web forms or posting to your Facebook timeline using the Facebook Graph API.
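As an illustration, submitting a web form usually comes down to POSTing the form fields as a dictionary. The URL and field names below are hypothetical placeholders, not a real endpoint:
import requests

# hypothetical login form - the endpoint and field names are placeholders
form_data = {'username': 'alice', 'password': 'secret'}
r = requests.post('https://www.example.com/login', data = form_data)
print(r.status_code)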
Here are some important points to ponder upon:
- When the method is GET, all form data is encoded into the URL and appended to the action URL as query string parameters. With POST, form data appears within the message body of the HTTP request (see the sketch after this list).
- In the GET method, the parameter data is limited to what we can fit into the request line (URL). It is safest to keep URLs under about 2K characters, though some servers handle up to 64K. The POST method has no such problem, since data is sent in the message body of the HTTP request, not the URL.
- Data sent in the GET method must be URL-encoded into ASCII characters. There is no such restriction in the POST method.
- GET is less secure than POST because the data sent is part of the URL, where it may be logged or cached. So the GET method should not be used when sending passwords or other sensitive information.
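A quick way to observe the first point is the httpbin.org testing service, which echoes back whatever it receives (using it here is our own choice for demonstration):
import requests

payload = {'q': 'python'}

# GET: the parameters are encoded into the URL itself
r_get = requests.get('https://httpbin.org/get', params = payload)
print(r_get.url)   # https://httpbin.org/get?q=python

# POST: the same data travels in the request body, so the URL stays clean
r_post = requests.post('https://httpbin.org/post', data = payload)
print(r_post.url)  # https://httpbin.org/post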
GET and POST Requests Using Python – FAQs
What is GET and POST Request in Python?
In Python, GET and POST requests are ways to send data to a server or retrieve data from a server using HTTP:
- GET Request: A GET request is used to retrieve data from a server. It sends data appended in the URL and is mainly used for fetching data. It has limitations on the amount of data that can be sent because data is sent in the URL.
- POST Request: A POST request is used to send data to the server, for example, when uploading a file or submitting a completed form. Data sent via POST is transmitted in the body of the request, allowing larger amounts of data to be sent compared to a GET request.
How to Send a POST Request with Python Requests?
To send a POST request using the Python requests library, you need to use the requests.post() method. This method allows you to send data to a server or API and is useful for tasks such as submitting form data or uploading files.
import requests
url = 'http://example.com/api'
data = {'key': 'value'}
response = requests.post(url, data=data)
print(response.text)
How Do You POST a Request to a Server in Python?
Posting a request to a server in Python typically involves specifying the URL of the server, organizing the data you wish to send, and then using the requests.post() method to send the data. Here’s an example of how to send JSON data:
import requests
import json
url = 'http://example.com/api'
data = {'key': 'value'}
headers = {'Content-Type': 'application/json'}
response = requests.post(url, data=json.dumps(data), headers=headers)
print(response.text)
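The requests library also accepts a json parameter that serializes the dictionary and sets the Content-Type header automatically, so the example above can be shortened to:
response = requests.post(url, json=data)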
What Does Requests get() Do in Python?
The requests.get() function in Python is used to send a GET request to a specified URL. It retrieves data from the server at that URL and returns it to the local Python environment. It's commonly used to access web pages, download files, or consume data from APIs.
import requests
url = 'http://example.com'
response = requests.get(url)
print(response.text)
How to API Call in Python?
Making an API call in Python typically involves sending a GET or POST request to the API’s URL. The requests library is commonly used for this purpose due to its simplicity and ease of use. Here’s an example of making a GET request to an API:
import requests
# Define the API endpoint
url = 'https://api.example.com/data'
# Set any API parameters
params = {
    'param1': 'value1',
    'param2': 'value2'
}
# Send the GET request
response = requests.get(url, params=params)
# Check the status code and process the response
if response.status_code == 200:
    data = response.json()
    print(data)
else:
    print("Failed to retrieve data", response.status_code)
This example demonstrates how to use requests.get() to send a GET request with parameters to an API, process the response, and handle errors effectively.