Tensorflow.js tf.train.adagrad() Function


Tensorflow.js is an open-source library developed by Google for running machine learning models and deep learning neural networks in the browser or in a Node.js environment.

The tf.train.adagrad() function is used to create a tf.AdagradOptimizer that uses the Adaptive Gradient Algorithm (Adagrad).

Syntax:

tf.train.adagrad(learningRate, initialAccumulatorValue?)

Parameters:

  • learningRate: It specifies the learning rate to be used by the adaptive gradient algorithm.
  • initialAccumulatorValue: (Optional) It specifies the initial value of the accumulators. It must be positive.

Return value: It returns a tf.AdagradOptimizer.
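
For reference, the optimizer can be created with just a learning rate, or with an explicit initialAccumulatorValue as the second argument (the concrete values below are only illustrative):

Javascript

// Importing the TensorFlow.js library
import * as tf from "@tensorflow/tfjs";

// Adagrad optimizer with only the learning rate
const opt1 = tf.train.adagrad(0.1);

// Adagrad optimizer with an explicit (positive) initial
// accumulator value; 0.1 here is just an illustrative choice
const opt2 = tf.train.adagrad(0.1, 0.1);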

Example 1: Fit the function f(x) = x + y by learning the coefficient y (the variable x below is shadowed by the function's argument, so only y is actually updated during training).

Javascript




// Importing the TensorFlow.js library
import * as tf from "@tensorflow/tfjs";
 
const xs = tf.tensor1d([0, 1, 2]);
const ys = tf.tensor1d([1.3, 2.5, 3.7]);
 
const x = tf.scalar(Math.random()).variable();
const y = tf.scalar(Math.random()).variable();
 
// Define a function f(x) = x + y. The outer variable x is
// shadowed by the argument, so only y is trained below.
const f = x => x.add(y);
const loss = (pred, label) =>
    pred.sub(label).square().mean();
 
const learningRate = 0.05;
 
// Create adagrad optimizer
const optimizer =
  tf.train.adagrad(learningRate);
 
// Train the model.
for (let i = 0; i < 5; i++) {
   optimizer.minimize(() => loss(f(xs), ys));
}
 
// Make predictions.
console.log(`x: ${x.dataSync()}, y: ${y.dataSync()}`);
const preds = f(xs).dataSync();
preds.forEach((pred, i) => {
    console.log(`x: ${i}, pred: ${pred}`);
});

Output (values vary between runs because the coefficients are randomly initialized):

x: 0.8561810255050659, y: 0.6922483444213867
x: 0, pred: 0.6922483444213867
x: 1, pred: 1.6922483444213867
x: 2, pred: 2.6922483444213867
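
The minimize() call used above can also return the current loss when its second argument (returnCost) is set to true, which is handy for checking that training is converging. A minimal sketch, reusing the xs, ys, f, loss and optimizer from Example 1:

Javascript

// Log the loss at every training step.
for (let i = 0; i < 5; i++) {
    const cost = optimizer.minimize(() => loss(f(xs), ys), true);
    console.log(`step ${i}, loss: ${cost.dataSync()}`);
    cost.dispose();
}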

Example 2: Fit a quadratic function by learning the coefficients a, b, c.

Javascript




// Importing the TensorFlow.js library
import * as tf from "@tensorflow/tfjs";
 
const xs = tf.tensor1d([0, 1, 2, 3]);
const ys = tf.tensor1d([1.1, 5.9, 16.8, 33.9]);
 
const a = tf.scalar(Math.random()).variable();
const b = tf.scalar(Math.random()).variable();
const c = tf.scalar(Math.random()).variable();
 
// Define a quadratic function f(x) = a*x^2 + b*x + c.
const f = x => a.mul(x.square()).add(b.mul(x)).add(c);
const loss = (pred, label) =>
         pred.sub(label).square().mean();
 
const learningRate = 0.01;
const optimizer =
      tf.train.adagrad(learningRate);
 
// Train the model.
for (let i = 0; i < 10; i++) {
   optimizer.minimize(() => loss(f(xs), ys));
}
 
// Make predictions.
console.log(`a: ${a.dataSync()}, b: ${b.dataSync()}, c: ${c.dataSync()}`);
const preds = f(xs).dataSync();
preds.forEach((pred, i) => {
   console.log(`x: ${i}, pred: ${pred}`);
});

Output

a: 0.3611285388469696, b: 0.6980878114700317, c: 0.8787991404533386
x: 0, pred: 0.8787991404533386
x: 1, pred: 1.9380154609680176
x: 2, pred: 3.7194888591766357
x: 3, pred: 6.223219394683838
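
minimize() additionally accepts an optional list of variables to update, which can be used to freeze some coefficients. A minimal sketch, assuming the same a, b, c, f, loss and optimizer as in Example 2; here c is left untouched while a and b are trained:

Javascript

// Only a and b are updated; c keeps its current value.
for (let i = 0; i < 10; i++) {
    optimizer.minimize(() => loss(f(xs), ys), false, [a, b]);
}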

Reference: https://js.tensorflow.org/api/1.0.0/#train.adagrad

