
Tensorflow.js tf.layers.prelu() Function

Last Updated : 26 Apr, 2022

Tensorflow.js is a Google-developed open-source library for running machine learning models and deep learning neural networks in the browser or in Node.js. It also enables developers to create machine learning models in JavaScript and use them directly in the browser or with Node.js.

The tf.layers.prelu() function is used to apply the parametric version of a leaky rectified linear unit (PReLU) activation function to its input.

Syntax:

tf.layers.prelu(args?)

Input Shape: Arbitrary. When utilizing this layer as the initial layer in a model, use the inputShape configuration.

Output Shape: The output has the same shape as the input.

Parameters: It accepts the args object which can have the following properties:

  • args: It is an object that contains the following properties:
    • alphaInitializer: The initializer for the learnable alpha weights.
    • alphaRegularizer: The regularizer for the learnable alpha weights.
    • alphaConstraint: The constraint for the learnable alpha weights.
    • sharedAxes: The axes along which the activation function's learnable parameters are shared.
    • inputShape: If set, it is used to construct an input layer that is inserted before this layer.
    • batchInputShape: If set, an input layer is created and inserted before this layer.
    • batchSize: If batchInputShape is not supplied but inputShape is, batchSize is used to build the batchInputShape.
    • dtype: The data type for this layer. Defaults to float32. This option applies only to input layers.
    • name: The layer's name, of string type.
    • trainable: Whether the weights of this layer can be updated by fit. Defaults to true.
    • weights: The layer's initial weight values.

Returns: It returns an object (PReLU).
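
PReLU computes f(x) = x for x ≥ 0 and f(x) = alpha · x for x < 0, where alpha is a learnable parameter (shared across axes if sharedAxes is set). As a minimal plain-JavaScript sketch of this element-wise rule, independent of TensorFlow.js, the fixed alpha = 0.25 below is only an illustrative stand-in for the weight the layer actually learns:

```javascript
// Element-wise PReLU: identity for non-negative inputs,
// alpha-scaled for negative inputs.
function prelu(x, alpha) {
  return x >= 0 ? x : alpha * x;
}

// Apply it across an array with a fixed illustrative alpha of 0.25
// (in tf.layers.prelu() alpha is a learned weight, not a constant).
const alpha = 0.25;
const input = [11, -8, -9, 12];
const output = input.map((v) => prelu(v, alpha));

console.log(output); // [ 11, -2, -2.25, 12 ]
```

With alpha = 0 this reduces to ordinary ReLU, and with a small fixed alpha it is Leaky ReLU; PReLU simply makes that slope trainable.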

Example 1:

Javascript

import * as tf from "@tensorflow/tfjs";
  
const pReLULayer = tf.layers.prelu({
    alphaInitializer: 'glorotUniform'
});
      
const x = tf.tensor([11, -8, -9, 12]);
  
pReLULayer.apply(x).print();


Output:

Tensor
   [11, -6.450459, -7.2567663, 12]

Note: because alpha is randomly initialized by glorotUniform, the values produced for the negative inputs will differ from run to run.

Example 2:

Javascript

import * as tf from "@tensorflow/tfjs";
  
const pReLULayer = tf.layers.prelu({
    alphaInitializer: 'glorotUniform'
});
      
const x = tf.tensor([1.12, -0.8, 1.9, 
    0.12, 0.25, -3.4], [2, 3]);
  
pReLULayer.apply(x).print();


Output:

Tensor
   [[1.12, 0.5329878, 1.9       ],
    [0.12, 0.25     , -3.0655782]]

Reference: https://js.tensorflow.org/api/latest/#layers.prelu


