List of Deep Learning Layers


Deep learning (DL) is characterized by the use of neural networks with multiple layers to model and solve complex problems. Each layer in a neural network plays a unique role in converting input data into meaningful and insightful outputs. This article explores the layers used to construct a neural network.

Role of Deep Learning Layers

A layer in a deep learning model is a fundamental building block of the model's architecture, responsible for processing and transforming the data that flows through it. Information moves through these layers sequentially: each layer takes input from the preceding layer and passes its transformed output to the next. This cascading process continues until the final layer produces the model's output.

The input to a layer consists of features or representations derived from the data processed by earlier layers. Each layer performs a specific computation or set of operations on this input, introducing non-linearity and abstraction. The transformed output, often referred to as activations or feature maps, encapsulates higher-level representations that capture complex patterns and relationships within the data.

The nature and function of each layer vary based on its type within the neural network architecture. For instance:

  1. Dense (Fully Connected) Layer: Neurons in this layer are connected to every neuron in the previous layer, creating a dense network of connections. This layer is effective in capturing global patterns in the data.
  2. Convolutional Layer: Specialized for grid-like data, such as images, this layer employs convolution operations to detect spatial patterns and features.
  3. Recurrent Layer: Suited for sequential data, recurrent layers utilize feedback loops to consider context from previous time steps, making them suitable for tasks like natural language processing.
  4. Pooling Layer: Reduces spatial dimensions and focuses on retaining essential information, aiding in downsampling and feature selection.
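
To make these roles concrete, the following sketch stacks several of these layer types into a small MATLAB (Deep Learning Toolbox) image classifier. The sizes used here (28x28 grayscale input, 8 filters, 10 classes) are illustrative assumptions, not values prescribed by the article:

    layers = [
        imageInputLayer([28 28 1])                   % 28x28 grayscale images
        convolution2dLayer(3, 8, 'Padding', 'same')  % 8 filters of size 3x3
        reluLayer                                    % non-linearity
        maxPooling2dLayer(2, 'Stride', 2)            % downsample by a factor of 2
        fullyConnectedLayer(10)                      % one neuron per class
        softmaxLayer                                 % class probabilities
        classificationLayer];                        % cross-entropy loss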

MATLAB Input Layers

inputLayer: Receives and processes data in a specialized format, serving as the initial stage for information entry into a neural network.

sequenceInputLayer: Receives sequential data for a neural network and incorporates normalization of the data during the input process.

featureInputLayer: Processes feature data for a neural network and integrates data normalization. This layer suits datasets of numerical scalar values that represent features, without spatial or temporal dimensions.

imageInputLayer: Processes two-dimensional images in a neural network and applies data normalization at the input stage.

image3dInputLayer: Receives three-dimensional images, such as volumetric data, for a neural network.
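
Each input layer is created by specifying the shape of a single observation. A minimal sketch, with placeholder sizes chosen only for illustration:

    inImage = imageInputLayer([224 224 3], 'Normalization', 'zscore');  % RGB images, z-score normalized
    inSeq   = sequenceInputLayer(12);           % sequences with 12 features per time step
    inFeat  = featureInputLayer(20);            % 20 scalar features per observation
    inVol   = image3dInputLayer([64 64 64 1]);  % single-channel 3-D volumes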

MATLAB Fully Connected Layers

fullyConnectedLayer: Multiplies the input by a weight matrix and then adds a bias vector.
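
In MATLAB you specify only the output size; the weight matrix W and bias vector b are sized automatically from the incoming data. A minimal sketch, assuming a 10-output head:

    fc = fullyConnectedLayer(10);   % computes y = W*x + b with 10 output neurons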

MATLAB Convolution Layers

convolution1dLayer: Applies sliding convolutional filters to 1-D input data.

convolution2dLayer: Applies sliding convolutional filters to 2-D input data.

convolution3dLayer: Applies sliding convolutional filters to 3-D input data.

transposedConv2dLayer: Increases the resolution of two-dimensional feature maps through upsampling (transposed convolution).

transposedConv3dLayer: Increases the resolution of three-dimensional feature maps through upsampling.
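
A minimal sketch of these constructors; the filter sizes, filter counts, and strides below are illustrative assumptions:

    conv1 = convolution1dLayer(5, 32);                     % 32 filters of width 5
    conv2 = convolution2dLayer(3, 16, 'Padding', 'same');  % 16 3x3 filters, spatial size preserved
    conv3 = convolution3dLayer(3, 8);                      % 8 3x3x3 filters
    up2   = transposedConv2dLayer(4, 16, 'Stride', 2);     % roughly doubles height and width
    up3   = transposedConv3dLayer(4, 8, 'Stride', 2);      % upsamples a 3-D feature map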

MATLAB Recurrent Layers

lstmLayer: A recurrent neural network (RNN) layer designed to capture and learn long-term dependencies between time steps in time-series and sequential data.

lstmProjectedLayer: An LSTM layer that learns long-term dependencies between time steps while compressing its weights through learnable projections.

bilstmLayer: A bidirectional LSTM (BiLSTM) layer that captures long-term dependencies in both the forward and backward directions between time steps. This bidirectional learning is valuable when the network needs insight from the entire time series at each individual time step.

gruLayer: A gated recurrent unit (GRU) layer that captures dependencies between time steps in time-series and sequential data.

gruProjectedLayer: A GRU layer that learns dependencies between time steps through learnable weights designed for projection.
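
A minimal sketch of these recurrent layers; the hidden-unit counts and projector sizes are illustrative assumptions, not recommendations:

    lstm = lstmLayer(128, 'OutputMode', 'last');        % one summary vector for the whole sequence
    bi   = bilstmLayer(128, 'OutputMode', 'sequence');  % forward and backward context at every step
    gru  = gruLayer(64);                                % lighter-weight gated recurrence
    % Projected variants compress the weights with learnable projections, e.g.
    lstmP = lstmProjectedLayer(128, 16, 8);             % output/input projector sizes 16 and 8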

MATLAB Activation Layers

reluLayer: Performs a threshold operation on each element of the input, setting any value less than zero to zero.

leakyReluLayer: Performs a threshold operation in which any input value less than zero is multiplied by a fixed scalar.

clippedReluLayer: Performs a threshold operation, setting any input value below zero to zero and capping any value above the defined ceiling at that ceiling value.

eluLayer: Exponential linear unit (ELU) layer that performs the identity operation on positive inputs and applies an exponential nonlinearity to negative inputs.

geluLayer: Gaussian error linear unit (GELU) layer that weights the input by its probability under a Gaussian distribution.

tanhLayer: Applies the hyperbolic tangent (tanh) function to the layer inputs.

swishLayer: Applies the swish function to the layer inputs.
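
Each activation layer takes at most one scalar argument; the values below are illustrative assumptions:

    a1 = reluLayer;              % max(x, 0)
    a2 = leakyReluLayer(0.01);   % scales negative inputs by 0.01
    a3 = clippedReluLayer(10);   % like ReLU, but caps outputs at 10
    a4 = eluLayer;               % exponential nonlinearity for negative inputs
    a5 = geluLayer;              % Gaussian error linear unit
    a6 = tanhLayer;              % squashes inputs into (-1, 1)
    a7 = swishLayer;             % x * sigmoid(x)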

MATLAB Pooling and Unpooling Layers

averagePooling1dLayer: Downsamples by dividing the input into 1-D pooling regions and computing the average of each region.

averagePooling2dLayer: Downsamples by dividing the input into rectangular pooling regions and computing the average of each region.

averagePooling3dLayer: Downsamples by dividing the three-dimensional input into cuboidal pooling regions and computing the average of each region.

globalAveragePooling1dLayer: Downsamples by outputting the average over the time or spatial dimensions of the input.

globalAveragePooling2dLayer: Downsamples by computing the mean over the height and width dimensions of the input.

globalAveragePooling3dLayer: Downsamples by computing the mean over the height, width, and depth dimensions of the input.

maxPooling1dLayer: Downsamples by dividing the input into 1-D pooling regions and computing the maximum of each region.

maxUnpooling2dLayer: Reverses the pooling operation on the output of a 2-D max pooling layer.
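
A minimal sketch; pool sizes and strides are illustrative assumptions. Note that unpooling needs the maximum indices recorded by its paired pooling layer:

    avg2 = averagePooling2dLayer(2, 'Stride', 2);   % halve height and width
    gap2 = globalAveragePooling2dLayer;             % one average value per channel
    mp1  = maxPooling1dLayer(2, 'Stride', 2);       % 1-D max pooling
    % Pair max pooling with unpooling via 'HasUnpoolingOutputs':
    mp2  = maxPooling2dLayer(2, 'Stride', 2, 'HasUnpoolingOutputs', true);
    unp  = maxUnpooling2dLayer;                     % reverses mp2 using its recorded indices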

MATLAB Normalization and Dropout Layers

batchNormalizationLayer: Normalizes a mini-batch of data across all observations for each channel independently. To speed up the training of a convolutional neural network and reduce sensitivity to network initialization, place batch normalization layers between convolutional layers and nonlinearities, such as ReLU layers.

groupNormalizationLayer: Normalizes a mini-batch of data across grouped subsets of channels for each observation independently. To speed up the training of a convolutional neural network and reduce sensitivity to network initialization, place group normalization layers between convolutional layers and nonlinearities, such as ReLU layers.

layerNormalizationLayer: Normalizes a mini-batch of data across all channels for each observation independently. To speed up the training of recurrent and multilayer perceptron neural networks and reduce sensitivity to network initialization, place layer normalization layers after the learnable layers, such as LSTM and fully connected layers.

dropoutLayer: Randomly sets input elements to zero with a specified probability.
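
Following the placement advice above, a typical convolutional block interleaves normalization between the convolution and its nonlinearity. The filter count and dropout probability here are illustrative assumptions:

    block = [
        convolution2dLayer(3, 16, 'Padding', 'same')
        batchNormalizationLayer   % normalize per channel across the mini-batch
        reluLayer
        dropoutLayer(0.5)];       % zero each element with probability 0.5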

MATLAB Output Layers

softmaxLayer: Applies the softmax function to the input.

sigmoidLayer: Applies a sigmoid function to the input, constraining the output to the range (0,1).

classificationLayer: Computes the cross-entropy loss for classification and weighted classification tasks with mutually exclusive classes.

regressionLayer: Computes the half-mean-squared-error loss for regression tasks.
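
A minimal sketch of the two common output heads in this workflow; numClasses is a placeholder:

    numClasses = 5;                             % placeholder class count
    clsHead = [fullyConnectedLayer(numClasses)
               softmaxLayer                     % converts scores to probabilities
               classificationLayer];            % cross-entropy loss
    regHead = [fullyConnectedLayer(1)           % single continuous output
               regressionLayer];                % half-mean-squared-error loss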

