TensorFlow.js tf.losses.softmaxCrossEntropy() Function
TensorFlow.js is an open-source library developed by Google for running machine learning models and deep learning neural networks in the browser or in a Node.js environment.
The tf.losses.softmaxCrossEntropy() function computes the softmax cross-entropy loss between two tensors and returns a new tensor.
Syntax:
tf.losses.softmaxCrossEntropy(onehotLabels, logits, weights, labelSmoothing, reduction)
Parameters: This function accepts five parameters (the last three of which are optional), as illustrated below:
- onehotLabels: The one-hot encoded labels. It must have the same dimensions as logits.
- logits: The predicted outputs.
- weights: A tensor whose rank is either 0 or 1 and which must be broadcastable to the shape of the loss.
- labelSmoothing: If this value is greater than 0, the labels are smoothed: each one-hot label becomes onehotLabels * (1 - labelSmoothing) + labelSmoothing / numClasses.
- reduction: The type of reduction to apply to the loss. It must be of type tf.Reduction.
Note: The weights, labelSmoothing, and reduction parameters are optional.
Return Value: It returns a tensor containing the softmax cross-entropy loss between the two tensors.
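Conceptually, the loss for each example is the cross entropy between the labels and the softmax of the logits, i.e. -sum(labels * logSoftmax(logits)) over the class axis; with unit weights and the default reduction, the per-example losses are then averaged over the batch. The following is a minimal sketch, not the library's internal implementation, that reproduces this computation with basic tensor ops:
Javascript
import * as tf from "@tensorflow/tfjs";

const labels = tf.tensor2d([[1, 4, 5], [5, 5, 7]]);
const logits = tf.tensor2d([[3, 2, 5], [3, 2, 7]]);

// Per-example loss: -sum(labels * logSoftmax(logits)) over the last axis
const perExample = tf.neg(tf.sum(tf.mul(labels, tf.logSoftmax(logits)), -1));

// With unit weights, the default reduction averages over the batch,
// matching tf.losses.softmaxCrossEntropy(labels, logits)
perExample.mean().print(); // prints a value close to 30.5595...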
Example 1: In this example, we pass only the two required parameters: the one-hot labels and the logits.
Javascript
import * as tf from "@tensorflow/tfjs";

// Labels (passed as onehotLabels) and the predicted logits
const a = tf.tensor2d([[1, 4, 5], [5, 5, 7]]);
const b = tf.tensor2d([[3, 2, 5], [3, 2, 7]]);

// Compute the softmax cross-entropy loss between the two tensors
const softmax_cross_entropy = tf.losses.softmaxCrossEntropy(a, b);
softmax_cross_entropy.print();
Output:
Tensor
30.55956268310547
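The reduction parameter controls how the per-example losses are combined. The sketch below, assuming the tf.Reduction enum exported by TensorFlow.js, passes tf.Reduction.NONE to get one loss value per example instead of a single scalar; weights of 1 and labelSmoothing of 0 are supplied explicitly to reach the fifth positional argument:
Javascript
import * as tf from "@tensorflow/tfjs";

const a = tf.tensor2d([[1, 4, 5], [5, 5, 7]]);
const b = tf.tensor2d([[3, 2, 5], [3, 2, 7]]);

// weights = 1, labelSmoothing = 0, reduction = NONE
const losses = tf.losses.softmaxCrossEntropy(a, b, 1, 0, tf.Reduction.NONE);
losses.print(); // one loss per example, roughly [15.70, 45.42],
                // whose mean is the 30.5595... seen above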
Example 2: In this example, we also pass the optional weights parameter. Note that weights is the third positional argument, so the scalar 5 below scales the loss by a factor of 5.
Javascript
import * as tf from "@tensorflow/tfjs";

const a = tf.tensor2d([[1, 2, 3, 4, 5], [7, 8, 9, 10, 11]]);
const b = tf.tensor2d([[6, 735, 8, 59, 10], [45, 34, 322, 2, 3]]);

// The scalar 5 is the optional weights argument; it scales the loss by 5
const softmax_cross_entropy = tf.losses.softmaxCrossEntropy(a, b, 5);
softmax_cross_entropy.print();
Output:
Tensor
50477.5
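Because labelSmoothing is the fourth positional argument, the weights argument must also be supplied in order to reach it. A minimal sketch of smoothing the labels (the weight of 1 and the smoothing factor of 0.2 are arbitrary illustration values):
Javascript
import * as tf from "@tensorflow/tfjs";

const labels = tf.tensor2d([[0, 1, 0], [0, 0, 1]]);
const logits = tf.tensor2d([[2, 5, 1], [1, 3, 6]]);

// With labelSmoothing = 0.2, each one-hot row is effectively replaced by
// labels * (1 - 0.2) + 0.2 / numClasses before the loss is computed
const smoothed = tf.losses.softmaxCrossEntropy(labels, logits, 1, 0.2);
smoothed.print();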
Reference: https://js.tensorflow.org/api/latest/#losses.softmaxCrossEntropy
Last Updated: 23 Jul, 2021