Python Tensorflow – tf.keras.layers.Conv1DTranspose() Function


The tf.keras.layers.Conv1DTranspose() function is used to apply the transposed 1D convolution operation, also known as deconvolution, on data.

Syntax: tf.keras.layers.Conv1DTranspose(filters, kernel_size, strides=1, padding='valid', output_padding=None, data_format=None, dilation_rate=1, activation=None, use_bias=True, kernel_initializer='glorot_uniform', bias_initializer='zeros', kernel_regularizer=None, bias_regularizer=None, activity_regularizer=None, kernel_constraint=None, bias_constraint=None, **kwargs)

Input Shape: A 3D tensor of shape: (batch_size, steps, channels)

Output Shape: A 3D tensor of shape: (batch_size, new_steps, filters)

Parameters:

  • filters (Integer): The output space’s dimensionality (i.e. the number of output filters in the convolution).
  • kernel_size (Integer): An integer specifying the length of the 1D convolution window.
  • strides: The stride of the convolution along the time dimension.
  • padding: The padding mode, one of 'valid' or 'same' (case-insensitive).
  • output_padding: An integer specifying the amount of padding along the time dimension of the output tensor. The amount of output padding must be lower than the stride. If None (the default), the output size is inferred.
  • data_format: The ordering of the dimensions in the inputs. channels_last (the default) corresponds to inputs with shape (batch_size, steps, channels).
  • dilation_rate: An integer specifying the dilation rate to use for the dilated convolution.
  • activation: The layer's activation function. If None, no activation is applied.
  • use_bias (Boolean): Whether the layer uses a bias vector. True is the default value.
  • kernel_initializer: The convolutional kernel weights matrix’s initializer.
  • bias_initializer: The bias vector’s initializer.
  • kernel_regularizer: The regularizer function applied to the kernel weights matrix.
  • bias_regularizer: The regularizer function applied to the bias vector.
  • activity_regularizer: The regularizer function applied to the activation.
  • kernel_constraint: The constraint for the convolutional kernel weights.
  • bias_constraint: The constraint for the bias vector.

Returns: A 3D tensor representing activation(conv1dtranspose(inputs, kernel) + bias).
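
Since the padding mode determines how new_steps is computed, a small sketch can make the difference concrete. The shapes below follow the standard transposed-convolution length formulas: (steps - 1) * strides + kernel_size for 'valid', and steps * strides for 'same'.

```python
import tensorflow as tf

x = tf.random.normal((2, 10, 1))  # (batch_size, steps, channels)

# padding='valid': new_steps = (steps - 1) * strides + kernel_size = 9*2 + 3 = 21
y_valid = tf.keras.layers.Conv1DTranspose(
    filters=4, kernel_size=3, strides=2, padding='valid')(x)
print(y_valid.shape)  # (2, 21, 4)

# padding='same': new_steps = steps * strides = 10 * 2 = 20
y_same = tf.keras.layers.Conv1DTranspose(
    filters=4, kernel_size=3, strides=2, padding='same')(x)
print(y_same.shape)  # (2, 20, 4)
```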

Example 1:

Python3




import tensorflow as tf

tensor_shape = (4, 28, 1)
input_shape = tensor_shape[1:]
X = tf.random.normal(tensor_shape)

def model(input_shape):
    X_input = tf.keras.layers.Input(shape=input_shape)
    X_output = tf.keras.layers.Conv1DTranspose(filters=8,
                                               kernel_size=4,
                                               strides=2)(X_input)
    model = tf.keras.models.Model(inputs=X_input,
                                  outputs=X_output)
    return model

model = model(input_shape)

Y = model.predict(X, steps=2)
print(Y.shape)


Output: 

(4, 58, 8)
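
The output length of 58 follows from the 'valid' padding formula applied to this example's parameters:

```python
# With padding='valid' (the default), the transposed 1D convolution produces
# new_steps = (steps - 1) * strides + kernel_size.
steps, strides, kernel_size = 28, 2, 4
new_steps = (steps - 1) * strides + kernel_size
print(new_steps)  # 58
```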

Example 2:

Python3




import tensorflow as tf
  
tensor_shape = (4, 4, 1)
input_shape = tensor_shape[1:]
X = tf.random.normal(tensor_shape)
  
  
def model(input_shape):
    X_input = tf.keras.layers.Input(shape=input_shape)
    X_output = tf.keras.layers.Conv1DTranspose(
        filters=3, kernel_size=3, strides=1)(X_input)
    model = tf.keras.models.Model(inputs=X_input, outputs=X_output)
    return model
  
  
model = model(input_shape)
  
Y = model.predict(X, steps=2)
print(Y)


Output:

[[[-0.30124253 -0.36105427 -0.2042067 ]
  [ 0.02215503 -0.02281483  0.06209912]
  [ 0.00216722 -0.06402665 -0.45107672]
  [ 0.61782545  0.6981941   0.5305761 ]
  [ 0.38394764  0.49401727 -0.32046565]
  [-0.72445303 -0.70179087  0.51991314]]

 [[-0.21620852 -0.25913674 -0.14656372]
  [-0.42101222 -0.5400373  -0.2516055 ]
  [ 1.1399035   1.2468109   0.51620144]
  [ 0.45842776  0.60374933 -0.43827266]
  [-0.996245   -0.97118413  0.717214  ]
  [ 0.03621851  0.03508553 -0.02599269]]

 [[-0.23306094 -0.27933523 -0.15798767]
  [ 0.22609143  0.23278703  0.18968783]
  [ 0.2541324   0.2872892  -0.21050403]
  [ 0.47528732  0.6270335   0.680698  ]
  [ 0.05677184  0.1858277  -0.08888393]
  [-0.7763872  -0.75210047  0.5571844 ]]

 [[ 1.2402442   1.4864949   0.8407385 ]
  [-0.580338   -0.49230838 -0.5872358 ]
  [-1.7384369  -1.8894652   0.76116455]
  [ 0.8071178   0.74401593 -0.37187982]
  [ 0.41134852  0.42184594 -0.30380705]
  [-0.13865426 -0.13431692  0.09950703]]]
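
As a sketch of a common use, Conv1DTranspose is often paired with Conv1D in encoder-decoder models, where it restores the sequence length that a strided Conv1D reduced. The layer parameters below are illustrative, not prescriptive:

```python
import tensorflow as tf

x = tf.random.normal((1, 16, 1))

# Encoder: a strided Conv1D halves the sequence length (16 -> 8).
down = tf.keras.layers.Conv1D(
    filters=4, kernel_size=3, strides=2, padding='same')(x)

# Decoder: Conv1DTranspose with the same stride restores it (8 -> 16).
up = tf.keras.layers.Conv1DTranspose(
    filters=1, kernel_size=3, strides=2, padding='same')(down)

print(down.shape, up.shape)  # (1, 8, 4) (1, 16, 1)
```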


Last Updated : 02 Jun, 2022