How to Apply Rectified Linear Unit Function Element-Wise in PyTorch?
In this article, we are going to see how to apply the Rectified Linear Unit (ReLU) function element-wise in PyTorch. We can apply the ReLU function element-wise by using the torch.nn.ReLU() method.
In PyTorch, the torch.nn.ReLU() method replaces all negative values with 0 and leaves all non-negative values unchanged. The values of the tensor must be real. We can also perform this operation in-place by passing inplace=True as a parameter. Before moving further, let's look at the syntax of the method: torch.nn.ReLU(inplace=False)
- inplace: This parameter is used when we want to perform the operation in-place. The default value of inplace is False.
The following program is to understand how to compute the Rectified Linear Unit Function Element-Wise.
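A minimal sketch of this (the tensor values are illustrative):

```python
import torch
import torch.nn as nn

# A tensor with both negative and non-negative values
tensor = torch.tensor([-2.0, -1.0, 0.0, 1.0, 2.0])

# Create the ReLU module (inplace defaults to False)
relu = nn.ReLU()

# Apply ReLU element-wise: negatives become 0, non-negatives are unchanged
result = relu(tensor)
print(result)   # tensor([0., 0., 0., 1., 2.])
print(tensor)   # the original tensor is unchanged
```

Because inplace is False here, a new tensor is returned and the input tensor keeps its original values.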
The following program is to understand how to Apply Rectified Linear Unit Function with inplace=True.
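A minimal sketch of the in-place variant (the 2D tensor values are illustrative):

```python
import torch
import torch.nn as nn

# A 2D tensor with mixed signs
tensor = torch.tensor([[-3.0, 4.0], [5.0, -6.0]])

# inplace=True modifies the input tensor directly instead of
# allocating a new output tensor
relu = nn.ReLU(inplace=True)
result = relu(tensor)

print(result)            # tensor([[0., 4.], [5., 0.]])
print(tensor)            # same values: the input itself was overwritten
print(result is tensor)  # True — the returned tensor is the input tensor
```

In-place application can save memory on large tensors, but the original values are lost, so avoid it when the input is still needed elsewhere (for example, by autograd).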