ML – List of Deep Learning Layers

  • Difficulty Level : Hard
  • Last Updated : 17 May, 2020

To specify the architecture of a neural network with all layers connected sequentially, create an array of layers directly. To specify the architecture of a network where layers can have multiple inputs or outputs, use a LayerGraph object. Use the following functions to create different layer types.
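For example, a simple sequential architecture can be written as a plain layer array, and converted to a LayerGraph when branching is needed (a minimal sketch; the layer sizes are illustrative):

```matlab
% Sequential architecture: a plain array of layers.
layers = [
    imageInputLayer([28 28 1])                    % 28x28 grayscale images
    convolution2dLayer(3, 16, 'Padding', 'same')  % 3x3 convolution, 16 filters
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

% For architectures where layers have multiple inputs or outputs,
% convert the array to a LayerGraph instead:
lgraph = layerGraph(layers);
```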

Input Layers:

  • imageInputLayer
    Inputs images to a network and applies data normalization.
  • sequenceInputLayer
    Inputs sequence data to a network.
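Both input layers take the input size as their first argument (a sketch; the sizes shown are illustrative):

```matlab
% Image input: height x width x channels, with default 'zerocenter' normalization.
imgIn = imageInputLayer([224 224 3]);

% Sequence input: number of features per time step.
seqIn = sequenceInputLayer(12);
```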
Learnable Layers:

  • convolution2dLayer
    Applies sliding convolutional filters to the input: it moves the filters along the input vertically and horizontally, computes the dot product of the weights and the input, and then adds a bias term.
  • transposedConv2dLayer
    Upsamples feature maps.
  • fullyConnectedLayer
    Multiplies the input by a weight matrix and then adds a bias vector.
  • lstmLayer
    A recurrent neural network (RNN) layer that adds support for time-series and sequence data in a network. It performs additive interactions, which can help improve gradient flow over long sequences during training, making it well suited to learning long-term dependencies.
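The learnable layers above can be created as follows (a sketch; filter counts, strides, and hidden sizes are illustrative):

```matlab
% 3x3 convolution with 16 filters; 'same' padding preserves spatial size.
conv = convolution2dLayer(3, 16, 'Padding', 'same');

% Transposed convolution that upsamples feature maps by a factor of 2.
upconv = transposedConv2dLayer(4, 16, 'Stride', 2);

% Fully connected layer with 10 outputs (e.g. one per class).
fc = fullyConnectedLayer(10);

% LSTM layer with 100 hidden units that outputs only the last time step.
lstm = lstmLayer(100, 'OutputMode', 'last');
```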

Activation Layers:

  • reluLayer
    Performs a threshold operation on each element of the input: any value less than zero is set to zero.
  • leakyReluLayer
    Performs a threshold operation in which any input value less than zero is multiplied by a fixed scalar.
  • clippedReluLayer
    Performs a threshold operation in which any input value less than zero is set to zero, and any value above the clipping ceiling is set to that ceiling.
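The scalar and ceiling are passed at construction (a sketch with illustrative values):

```matlab
relu    = reluLayer;               % max(0, x)
leaky   = leakyReluLayer(0.01);    % values below zero are scaled by 0.01
clipped = clippedReluLayer(10);    % max(0, min(x, 10))
```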
Normalization and Dropout Layers:



  • batchNormalizationLayer
    Normalizes each input channel across a mini-batch: the layer first normalizes the activations of each channel by subtracting the mini-batch mean and dividing by the mini-batch standard deviation, then shifts the input by a learnable offset and scales it by a learnable scale factor. Use batch normalization layers between convolutional layers and nonlinearities, such as ReLU layers, to speed up training of convolutional neural networks and reduce sensitivity to network initialization.
  • crossChannelNormalizationLayer
    Carries out channel-wise normalization.
  • dropoutLayer
    Randomly sets input elements to zero with a given probability.
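A sketch of these layers with illustrative parameters (window size and dropout probability are assumptions):

```matlab
bn   = batchNormalizationLayer;            % learnable offset and scale per channel
ccn  = crossChannelNormalizationLayer(5);  % normalize over a window of 5 channels
drop = dropoutLayer(0.5);                  % zero each element with probability 0.5
```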
Pooling Layers:

  • averagePooling2dLayer
    Performs downsampling by dividing the input into rectangular pooling regions and computing the average value of each region.
  • maxPooling2dLayer
    Performs downsampling by dividing the input into rectangular pooling regions and computing the maximum of each region.
  • maxUnpooling2dLayer
    Unpools the output of a max pooling layer.
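Pooling layers take the pooling-region size as their first argument (a sketch; the 2x2/stride-2 configuration is illustrative):

```matlab
avgPool = averagePooling2dLayer(2, 'Stride', 2);  % 2x2 regions, halves height and width
maxPool = maxPooling2dLayer(2, 'Stride', 2);
unpool  = maxUnpooling2dLayer;                    % reverses a max pooling layer
```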
Combination Layers:

  • additionLayer
    Adds multiple inputs element-wise. Specify the number of inputs when you create the layer. The inputs have names ‘in1’, ‘in2’, …, ‘inN’, where N is the number of inputs; use these names when connecting or disconnecting the layer from other layers with connectLayers or disconnectLayers. All inputs to an addition layer must have the same dimensions.
  • depthConcatenationLayer
    Takes multiple inputs that have the same height and width and concatenates them along the third (channel) dimension.
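Combination layers require a LayerGraph, since they have multiple inputs. A minimal skip-connection sketch (the layer names are illustrative):

```matlab
% Two-input addition layer; its inputs are named 'add/in1' and 'add/in2'.
layers = [
    imageInputLayer([28 28 1], 'Name', 'in')
    convolution2dLayer(3, 16, 'Padding', 'same', 'Name', 'conv_1')
    reluLayer('Name', 'relu_1')
    convolution2dLayer(3, 16, 'Padding', 'same', 'Name', 'conv_2')
    additionLayer(2, 'Name', 'add')];  % 'conv_2' feeds 'add/in1' automatically

lgraph = layerGraph(layers);
% Route the relu_1 output around conv_2 into the second addition input.
lgraph = connectLayers(lgraph, 'relu_1', 'add/in2');
```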
Output Layers:

  • softmaxLayer
    Applies a softmax function to the input.
  • classificationLayer
    Holds the name of the loss function the software uses for training the network for multiclass classification.
  • regressionLayer
    Holds the name of the loss function the software uses for training the network for regression, and the response names.
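A network ends with one of these output layers; typical heads look like the following sketch (the output sizes are illustrative):

```matlab
% Classification head: class scores -> probabilities -> cross-entropy loss.
clsHead = [fullyConnectedLayer(10); softmaxLayer; classificationLayer];

% Regression head: a single response trained with half-mean-squared-error loss.
regHead = [fullyConnectedLayer(1); regressionLayer];
```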
