Tensorflow.js tf.train.Optimizer class .computeGradients() Method

Tensorflow.js is an open-source library developed by Google for running machine learning models and deep learning neural networks in the browser or in a Node.js environment.

The .computeGradients() method executes f() and computes the gradient of the scalar output of f() with respect to the list of trainable variables provided by varList. If no list is provided, it defaults to all trainable variables. Unlike minimize(), it only returns the gradients; it does not apply them to the variables.



Syntax: 

Optimizer.computeGradients(f, varList?);

Parameters:

    f: The function to execute. Its scalar output is differentiated, so it must return a tf.Scalar.
    varList (optional): The list of trainable variables to compute the gradients with respect to. If it is not provided, gradients are computed for all trainable variables.
Returns: { value : tf.Scalar, grads : { [ name : string ] : tf.Tensor } }
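Because the gradients are returned rather than applied, they can be inspected or passed to optimizer.applyGradients() manually. The following is a minimal sketch of that pattern; the variable w, the use of tf.train.sgd, and the learning rate are illustrative assumptions and are not part of the examples below.

// Importing tensorflow
import * as tf from "@tensorflow/tfjs"

// A single trainable variable w, initialized to 2 (illustrative)
const w = tf.scalar(2).variable();

// Any optimizer works here; plain SGD is assumed
const optimizer = tf.train.sgd(0.1);

// Compute the value of f(w) = w^2 and its gradient with respect to w
const {value, grads} = optimizer.computeGradients(() => w.square());

value.print();                   // 4, the scalar output of f()
console.log(Object.keys(grads)); // names of the variables in the gradient map

// The returned gradients can then be applied manually
optimizer.applyGradients(grads);
w.print();                       // 2 - 0.1 * (2 * 2) = 1.6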

Example 1: 




// Importing tensorflow
import * as tf from "@tensorflow/tfjs"

const xs = tf.tensor1d([3, 4, 5]);
const ys = tf.tensor1d([3.5, 4.7, 5.3]);

const x = tf.scalar(Math.random()).variable();
const y = tf.scalar(Math.random()).variable();

// Define a function f(x) = x^2 - y, where y is a trainable variable
const f = x => x.square().sub(y);
const loss = (pred, label) =>
    pred.sub(label).square().mean();

const learningRate = 0.05;

// Create the adam optimizer
const optimizer = tf.train.adam(learningRate);

// Compute the gradients of the loss; note that computeGradients()
// only returns the gradients and does not apply them to the variables
for (let i = 0; i < 6; i++) {
    optimizer.computeGradients(() => loss(f(xs), ys));
}

// Make predictions
console.log(
    `x: ${x.dataSync()}, y: ${y.dataSync()}`);
const preds = f(xs).dataSync();
preds.forEach((pred, i) => {
    console.log(`x: ${i}, pred: ${pred}`);
});

Output:

x: 0.38272422552108765, y: 0.7651948928833008
x: 0, pred: 8.2348051071167
x: 1, pred: 15.2348051071167
x: 2, pred: 24.234806060791016

 Example 2:  




// Importing tensorflow
import * as tf from "@tensorflow/tfjs"

const xs = tf.tensor1d([0, 1, 2, 3]);
const ys = tf.tensor1d([1.3, 3.7, 12.4, 26.6]);

// Choosing random coefficients
const a = tf.scalar(Math.random()).variable();
const b = tf.scalar(Math.random()).variable();
const c = tf.scalar(Math.random()).variable();

// Defining the function f(x) = a*(3*x) + b^2 + c
const f = x => a.mul(x.mul(3)).add(b.square()).add(c);
const loss = (pred, label) => pred.sub(label).square().mean();

// Setting configurations for our optimizer
const learningRate = 0.01;
const initialAccumulatorValue = 10;

// Create the adagrad optimizer
const optimizer = tf.train.adagrad(learningRate,
    initialAccumulatorValue);

// Compute the gradients of the loss; computeGradients()
// returns them without applying them to a, b and c
for (let i = 0; i < 5; i++) {
    optimizer.computeGradients(() => loss(f(xs), ys));
}

// Make predictions
console.log(`a: ${a.dataSync()},
    b: ${b.dataSync()}, c: ${c.dataSync()}`);
const preds = f(xs).dataSync();
preds.forEach((pred, i) => {
    console.log(`x: ${i}, pred: ${pred}`);
});

 Output: 

a: 0.22211307287216187,
b: 0.2304522693157196,
c: 0.42621928453445435
x: 0, pred: 0.479327529668808
x: 1, pred: 1.1456668376922607
x: 2, pred: 1.8120059967041016
x: 3, pred: 2.4783451557159424
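Neither example above passes the optional varList argument. As a short sketch of how it restricts the gradient computation to a subset of the trainable variables (the variables and the function below are illustrative assumptions, not taken from the examples):

// Importing tensorflow
import * as tf from "@tensorflow/tfjs"

const x = tf.scalar(Math.random()).variable();
const y = tf.scalar(Math.random()).variable();

// f() = x^2 + 3*y, a scalar built from both variables
const f = () => x.square().add(y.mul(3));

const optimizer = tf.train.adam(0.05);

// Only compute the gradient with respect to x;
// y does not appear in the returned gradient map
const {grads} = optimizer.computeGradients(f, [x]);
console.log(Object.keys(grads).length); // 1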

Reference: https://js.tensorflow.org/api/latest/#tf.train.Optimizer.computeGradients

