Python PyTorch – backward() Function

Last Updated : 09 Jun, 2022

In this article, we discuss the backward() method in PyTorch with detailed examples.

backward() Method

The backward() method in PyTorch computes gradients during the backward pass of a neural network. If backward() is never called, no gradients are computed. Gradients are computed only for tensors whose requires_grad attribute is set to True, and they can be accessed through the .grad attribute. If backward() has not been called, or if a tensor's requires_grad is False, its .grad attribute is None.

Syntax: tensor.backward()

Example 1

In the code below, we define a tensor and a function of it, then check the gradient without calling the backward() method. The gradient of the tensor is therefore None.

Python3
# import necessary libraries
import torch
  
# define a tensor
A = torch.tensor(5., requires_grad=True)
print("Tensor-A:", A)
  
# define a function of the above tensor
x = A**3
print("x:", x)
  
# print the gradient using .grad
print("A.grad:", A.grad)

Output

Tensor-A: tensor(5., requires_grad=True)
x: tensor(125., grad_fn=<PowBackward0>)
A.grad: None

Example 2

In this example, we define a tensor A with requires_grad=True and create a function x of it, then call the backward() method on x to compute the gradient.

Python3
# import necessary libraries
import torch
  
# define a tensor
A = torch.tensor(5., requires_grad=True)
print("Tensor-A:", A)
  
# define a function of the above tensor
x = A**3
print("x:", x)
  
# call the backward method
x.backward()
  
# print the gradient using .grad
print("A.grad:", A.grad)

Output

Tensor-A: tensor(5., requires_grad=True)
x: tensor(125., grad_fn=<PowBackward0>)
A.grad: tensor(75.)
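The value 75 matches the analytic derivative: for x = A**3, dx/dA = 3*A**2, which is 3 * 25 = 75 at A = 5. As a sketch, the same gradient can also be obtained with torch.autograd.grad, which returns the gradient directly instead of storing it in A.grad:

```python
# import necessary libraries
import torch

# define the same tensor and function as Example 2
A = torch.tensor(5., requires_grad=True)
x = A ** 3

# torch.autograd.grad computes dx/dA without mutating A.grad
(grad,) = torch.autograd.grad(x, A)
print("dx/dA:", grad)  # tensor(75.)
```

This is convenient when you want a one-off gradient and do not want it accumulated into the .grad attribute.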

Example 3

In this example, we consider two tensors, one with requires_grad set to True and the other set to False, create a function of them, and compute the gradients. The gradient of tensor B is None.

Python3
# import necessary libraries
import torch
  
# define two tensors
A = torch.tensor(2., requires_grad=True)
print("Tensor-A:", A)
B = torch.tensor(5., requires_grad=False)
print("Tensor-B:", B)
  
# define a function of the above tensors
x = A*B
print("x:", x)
  
# call the backward method
x.backward()
  
# print the gradients using .grad
print("A.grad:", A.grad)
print("B.grad:", B.grad)

Output

Tensor-A: tensor(2., requires_grad=True)
Tensor-B: tensor(5.)
x: tensor(10., grad_fn=<MulBackward0>)
A.grad: tensor(5.)
B.grad: None
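Note that the no-argument form x.backward() used above works only because x is a scalar. For a non-scalar output, PyTorch requires an explicit gradient argument of the same shape as the output. A minimal sketch (the values here are illustrative, not from the examples above):

```python
# import necessary libraries
import torch

# define a vector tensor
A = torch.tensor([1., 2., 3.], requires_grad=True)

# x is non-scalar, so x.backward() alone would raise
# "grad can be implicitly created only for scalar outputs"
x = A ** 2

# pass a gradient tensor of ones with the same shape as x;
# this yields dx/dA element-wise, i.e. 2*A
x.backward(gradient=torch.ones_like(x))
print("A.grad:", A.grad)  # tensor([2., 4., 6.])
```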
