PyTorch is an open-source machine learning library developed by Facebook. It is widely used for applications such as deep neural networks and natural language processing.
One of the many activation functions is the hyperbolic tangent function (also known as tanh), which is defined as tanh(x) = (e^x − e^(−x)) / (e^x + e^(−x)).
The hyperbolic tangent function outputs values in the range (-1, 1), mapping strongly negative inputs to values near -1 and strongly positive inputs to values near 1. Unlike the sigmoid function, only near-zero inputs are mapped to near-zero outputs, which mitigates the “vanishing gradients” problem to some extent. The hyperbolic tangent function is differentiable at every point, and its derivative comes out to be 1 − tanh²(x). Since this expression involves the tanh function itself, the value computed in the forward pass can be reused to make backward propagation faster.
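As a minimal sketch (the tensor values here are chosen purely for illustration), the gradient PyTorch computes for tanh can be checked against the closed form 1 − tanh²(x):

```python
import torch

x = torch.tensor([0.5, -1.2, 2.0], requires_grad=True)
y = torch.tanh(x)

# sum() gives a scalar so gradients flow back to every element of x
y.sum().backward()

# The analytic derivative is 1 - tanh(x)^2, so the forward result
# can be reused instead of recomputing the exponentials
expected = 1 - torch.tanh(x) ** 2
print(torch.allclose(x.grad, expected))  # True
```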
Despite the lower chances of the network getting “stuck” when compared with the sigmoid function, the hyperbolic tangent function still suffers from “vanishing gradients”. Rectified Linear Unit (ReLU) can be used to overcome this problem.
torch.tanh() provides support for the hyperbolic tangent function in PyTorch. It accepts any real-valued tensor as input, and the output lies in the range (-1, 1). If the input contains more than one element, the hyperbolic tangent is computed element-wise.
Syntax: torch.tanh(x, out=None)
x: Input tensor
out (optional): Output tensor
Return type: A tensor with the same type as that of x.
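The original listing for this section is not shown; a minimal sketch that reproduces the printout below (the tensor values are taken from that output) might look like:

```python
import torch

# Tensor of sample values, matching the printout below
a = torch.FloatTensor([1.0, -0.5, 3.4, -2.1, 0.0, -6.5])
print(a)

# Apply the hyperbolic tangent element-wise
b = torch.tanh(a)
print(b)
```

Note that tanh(-6.5) is approximately -0.99999, which prints as -1.0000 at four decimal places.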
Input:
 1.0000
-0.5000
 3.4000
-2.1000
 0.0000
-6.5000
[torch.FloatTensor of size 6]

Output:
 0.7616
-0.4621
 0.9978
-0.9705
 0.0000
-1.0000
[torch.FloatTensor of size 6]
Code #2: Visualization
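The listing itself is missing here; a sketch that produces the 15 values below could be the following (the use of torch.linspace(-5, 5, 15) is an assumption inferred from the printed output, and matplotlib is assumed to be available for the plot):

```python
import torch

# 15 evenly spaced points in [-5, 5]
a = torch.linspace(-5, 5, 15)

# Hyperbolic tangent of each point
b = torch.tanh(a)
print(b)

# Plot the curve; skip gracefully if matplotlib is not installed
try:
    import matplotlib.pyplot as plt
    plt.plot(a.numpy(), b.numpy(), color='red', marker='o')
    plt.title('torch.tanh')
    plt.xlabel('X')
    plt.ylabel('Y')
    plt.show()
except ImportError:
    pass
```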
Output:
-0.9999
-0.9996
-0.9984
-0.9934
-0.9728
-0.8914
-0.6134
 0.0000
 0.6134
 0.8914
 0.9728
 0.9934
 0.9984
 0.9996
 0.9999
[torch.FloatTensor of size 15]