Tensorflow.js tf.GraphModel class .predict() Method
Last Updated :
01 Aug, 2021
TensorFlow.js is an open-source library developed by Google for running machine learning models and deep learning neural networks in the browser or in a Node.js environment.
The .predict() function is used to execute inference for the given input tensors.
Syntax:
predict(inputs, config?)
Parameters:
- inputs: The model inputs. It is of type (tf.Tensor|tf.Tensor[]|{[name: string]: tf.Tensor}).
- config: The prediction configuration, used to specify the batch size and output node names. Note that at present the batch size selection is ignored for GraphModel. It is optional and of type object, with the following properties:
  - batchSize: The batch size. It is optional and of type integer; if undefined, it defaults to 32.
  - verbose: The verbosity mode. It is optional and defaults to false.
Return Value: It returns tf.Tensor|tf.Tensor[]|{[name: string]: tf.Tensor}.
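Because the return type mirrors the model's output structure, calling code may receive a single tensor, an array of tensors, or a name-keyed map of tensors. A minimal sketch of a helper that normalizes all three shapes into an array (the `toTensorArray` name and the duck-typing check on `.print()` are illustrative assumptions, not part of the TensorFlow.js API):

```javascript
// Normalize predict()'s three possible return shapes
// (tensor, tensor array, or {name: tensor} map) to an array.
function toTensorArray(output) {
  if (Array.isArray(output)) return output;
  if (output !== null && typeof output === "object" &&
      typeof output.print !== "function") {
    // A plain {name: tensor} map (actual tensors expose .print()).
    return Object.values(output);
  }
  // A single tensor.
  return [output];
}
```

With such a helper, `toTensorArray(model.predict(inputs))` can be iterated uniformly regardless of how many outputs the graph defines.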
Example 1: In this example, we are loading MobileNetV2 from a URL and making a prediction on an all-zeros input.
Javascript
import * as tf from "@tensorflow/tfjs";

// The model URL was omitted in the source; supply the model.json
// URL of a hosted MobileNetV2 GraphModel here.
const model_Url = "";

const mymodel = await tf.loadGraphModel(model_Url);
const inputs = tf.zeros([1, 224, 224, 3]);
mymodel.predict(inputs).print();
Output:
Tensor
[[-0.1800361, -0.4059965, 0.8190175,
...,
-0.8953396, -1.0841646, 1.2912753],]
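The printed tensor is a batch of classification logits; a typical next step is to take the index of the largest logit as the predicted class, which in TensorFlow.js is done with `.argMax(-1)` on the prediction tensor. The reduction itself can be sketched in plain JavaScript (the `argMax` helper below is illustrative, not a tfjs call):

```javascript
// Return the index of the largest value in an array of logits
// (top-1 class), mirroring what tensor.argMax(-1) computes.
function argMax(logits) {
  let best = 0;
  for (let i = 1; i < logits.length; i++) {
    if (logits[i] > logits[best]) best = i;
  }
  return best;
}

// e.g. argMax([-0.18, 0.82, -0.89]) → 1
```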
Example 2: In this example, we are loading MobileNetV2 from a TF Hub URL and making a prediction on an all-zeros input.
Javascript
import * as tf from "@tensorflow/tfjs";

// The TF Hub model URL was omitted in the source; supply the
// TF Hub URL of a MobileNetV2 GraphModel here.
const model_Url = "";

const model = await tf.loadGraphModel(model_Url, { fromTFHub: true });
const inputs = tf.zeros([1, 224, 224, 3]);

// batchSize and verbose are passed inside the config object,
// not as separate positional arguments.
model.predict(inputs, { batchSize: 1, verbose: true }).print();
Output:
Tensor
[[-1.1690605, 0.0195426, 1.1962479,
...,
-0.4825858, -0.0055641, 1.1937635],]
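One practical note: the tensors returned by .predict() hold memory that is not reclaimed by JavaScript garbage collection alone, so callers should release them with `.dispose()` (or wrap the computation in `tf.tidy()`). A sketch of a small try/finally wrapper enforcing that contract (the `withPrediction` helper is illustrative, shown here with a stand-in object exposing the same `dispose()` method a real tensor has):

```javascript
// Run a computation on a prediction tensor, then always free it.
function withPrediction(tensor, use) {
  try {
    return use(tensor);
  } finally {
    tensor.dispose(); // release the memory backing the tensor
  }
}
```

In real tfjs code this would look like `withPrediction(model.predict(inputs), t => t.argMax(-1).dataSync())`, or equivalently the whole call could be placed inside `tf.tidy()`.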
Reference: https://js.tensorflow.org/api/latest/#tf.GraphModel.predict