
How to Convert a PyTorch Tensor to a NumPy Array?

Last Updated: 30 Jun, 2021

In this article, we are going to convert a PyTorch tensor to a NumPy array, using the tensor.numpy() method and the numpy.array() function.

Method 1: Using numpy().

Syntax: tensor_name.numpy()

Example 1: Converting a one-dimensional tensor to a NumPy array

Python3
# import the torch module
import torch

# create a one-dimensional tensor with
# float type elements
b = torch.tensor([10.12, 20.56, 30.00, 40.3, 50.4])

print(b)

# convert the tensor into a NumPy array
# using the numpy() method
b = b.numpy()

# display the NumPy array
print(repr(b))


Output:

tensor([10.1200, 20.5600, 30.0000, 40.3000, 50.4000])
array([10.12, 20.56, 30.  , 40.3 , 50.4 ], dtype=float32)
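
Note: for a CPU tensor, numpy() does not copy the data; the returned NumPy array shares its storage with the tensor, so an in-place change to one is visible in the other. A quick sketch of this behaviour (the variable names t and a are only illustrative):

Python3

# import the torch module
import torch

# create a tensor and convert it with numpy()
t = torch.tensor([1.0, 2.0, 3.0])
a = t.numpy()   # a shares memory with t; no copy is made

# modify the tensor in place
t[0] = 100.0

# the change shows up in the array as well
print(a)   # [100.   2.   3.]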

Example 2: Converting a two-dimensional tensor to a NumPy array

Python3
# import the torch module
import torch

# create a two-dimensional tensor with
# integer type elements
b = torch.tensor([[1, 2, 3, 4, 5],
                  [3, 4, 5, 6, 7],
                  [4, 5, 6, 7, 8]])

print(b)

# convert the tensor into a NumPy array
# using the numpy() method
b = b.numpy()

# display the NumPy array
print(repr(b))


Output:

tensor([[1, 2, 3, 4, 5],
        [3, 4, 5, 6, 7],
        [4, 5, 6, 7, 8]])
array([[1, 2, 3, 4, 5],
       [3, 4, 5, 6, 7],
       [4, 5, 6, 7, 8]])
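
The element type carries over unchanged: the default integer tensor above is torch.int64, so the resulting array has dtype int64, just as the float tensor in Example 1 produced a float32 array. A small check (variable names are illustrative):

Python3

# import the torch module
import torch

# default dtypes: float32 for floats, int64 for integers
f = torch.tensor([1.5, 2.5])
i = torch.tensor([[1, 2], [3, 4]])

# the dtype is preserved by numpy()
print(f.numpy().dtype)   # float32
print(i.numpy().dtype)   # int64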

Method 2: Using numpy.array().

The numpy.array() function can also be used to convert a tensor into a NumPy array.

Syntax: numpy.array(tensor_name)

Example: Converting a two-dimensional tensor to a NumPy array

Python3
# import the torch module
import torch

# import the numpy module
import numpy

# create a two-dimensional tensor with
# integer type elements
b = torch.tensor([[1, 2, 3, 4, 5],
                  [3, 4, 5, 6, 7],
                  [4, 5, 6, 7, 8]])

print(b)

# convert the tensor into a NumPy array
# using the numpy.array() method
b = numpy.array(b)

# display the NumPy array
print(repr(b))


Output:

tensor([[1, 2, 3, 4, 5],
        [3, 4, 5, 6, 7],
        [4, 5, 6, 7, 8]])
array([[1, 2, 3, 4, 5],
       [3, 4, 5, 6, 7],
       [4, 5, 6, 7, 8]])
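
Unlike tensor.numpy(), numpy.array() makes an independent copy by default (its copy argument defaults to True), so later in-place changes to the tensor are not reflected in the array. A short sketch contrasting the two, assuming a CPU tensor (variable names are illustrative):

Python3

# import the modules
import torch
import numpy

t = torch.tensor([1, 2, 3])

shared = t.numpy()        # shares memory with t
copied = numpy.array(t)   # independent copy of the data

# modify the tensor in place
t[0] = 99

print(shared)   # [99  2  3] -- the shared view sees the change
print(copied)   # [1 2 3]    -- the copy is unaffected

Note that both methods expect a CPU tensor that is detached from the autograd graph; for a tensor on the GPU or one with requires_grad=True, call tensor.detach().cpu().numpy() first.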

