
Linear mapping


Linear mapping is a mathematical operation that transforms a set of input values into a set of output values using a linear function. In machine learning, linear mapping is often used as a preprocessing step to transform the input data into a more suitable format for analysis. Linear mapping can also be used as a model in itself, such as in linear regression or linear classifiers.

The linear mapping function can be represented as follows:

y = Wx + b

where x is the input vector, W is the weight matrix, b is the bias vector, and y is the output vector. The weight matrix and bias vector are learned during the training process.
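As a concrete illustration, here is a minimal NumPy sketch of this mapping; the shapes and values are arbitrary, chosen only for demonstration:

import numpy as np

# Arbitrary weight matrix W (2x3), bias b (2,), and input x (3,).
W = np.array([[1.0, 0.5, -1.0],
              [2.0, -1.5, 0.0]])
b = np.array([0.1, -0.2])
x = np.array([3.0, 1.0, 2.0])

# The linear (affine) mapping y = Wx + b.
y = W @ x + b
print(y)  # [1.6  4.3]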

Linear mapping has several advantages in machine learning:

  • Simplicity: Linear mapping is a simple and easy-to-understand mathematical operation, making it an attractive choice for many machine learning tasks.
  • Speed: Linear mapping is a computationally efficient operation, making it suitable for large datasets and real-time applications.
  • Interpretability: Linear mapping is a transparent and interpretable operation, making it easier to understand and analyze the results of a model.
  • Versatility: Linear mapping can be applied to a wide range of machine learning tasks, including regression, classification, and clustering.

However, linear mapping also has some limitations:

  • Limited expressiveness: Linear mapping can only model linear relationships between variables, which may not be sufficient for complex tasks that require non-linear relationships.
  • Sensitivity to outliers: Linear mapping is sensitive to outliers in the data, which can lead to poor model performance.
  • Limited feature engineering: Linear mapping may not be able to capture complex interactions between features, which can limit its ability to extract meaningful information from the data.

Overall, linear mapping is a powerful and versatile tool in machine learning, but it should be used with care and in combination with other techniques to address its limitations.

Linear Mapping

Let V and W be vector spaces over a field K. A function f: V \rightarrow W is said to be a linear map if, for any two vectors u, v \epsilon V and any scalar c \epsilon K, the following hold:

  • The transformation is additive:
f(u + v) = f(u) + f(v)
  • The transformation is homogeneous with respect to scalar multiplication:
f(cu) = c \cdot f(u)

Zero/Identity Transformation

A linear transformation T: V \rightarrow V from a vector space into itself is called a linear operator. Two special cases are:

  • Zero transformation: A transformation T: V \rightarrow W is called the zero transformation if:

T(v) = 0 \, \forall \, v \epsilon V

  • Identity transformation: A transformation T: V \rightarrow V is called the identity transformation if:

T(v) = v \, \forall \, v \epsilon V

Properties of Linear Transformation

Let T: V \rightarrow W be a linear transformation and let u, v \epsilon V. Then the following properties hold:

T(0) = 0
T(-v) = -T(v)
T(u - v) = T(u) - T(v)
If v = c_1 v_1 + c_2 v_2 + ... + c_n v_n, then T(v) = c_1 T(v_1) + c_2 T(v_2) + ... + c_n T(v_n)
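These properties are straightforward to check numerically for any concrete linear map. The following sketch uses an arbitrary random matrix as the transformation T:

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))  # an arbitrary linear map T(v) = Av
u = rng.standard_normal(4)
v = rng.standard_normal(4)

def T(x):
    return A @ x

assert np.allclose(T(np.zeros(4)), 0)      # T(0) = 0
assert np.allclose(T(-v), -T(v))           # T(-v) = -T(v)
assert np.allclose(T(u - v), T(u) - T(v))  # T(u - v) = T(u) - T(v)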

Linear Transformation of Matrix

Let A be an m x n matrix. The transformation T: R^n \rightarrow R^m defined by

T(v) = Av

is a linear transformation.

Zero and Identity Matrix operations

  • The m x n zero matrix corresponds to the zero transformation from R^n \rightarrow R^m.
  • The n x n identity matrix \mathbb{I_n} corresponds to the identity transformation from R^n \rightarrow R^n.

A \cdot v = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix} \cdot \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix} = \begin{bmatrix} a_{11} v_1 + a_{12} v_2 + \cdots + a_{1n} v_n \\ \vdots \\ a_{m1} v_1 + a_{m2} v_2 + \cdots + a_{mn} v_n \end{bmatrix}
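In code, this matrix-vector product is exactly what NumPy's @ operator computes. A small sketch, with an arbitrary matrix and vector, comparing it against the explicit sum of products:

import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])  # 2x3 matrix: a map from R^3 to R^2
v = np.array([1.0, 0.0, -1.0])

# Entry i of A @ v is a_i1*v_1 + a_i2*v_2 + ... + a_in*v_n.
manual = np.array([sum(A[i, j] * v[j] for j in range(A.shape[1]))
                   for i in range(A.shape[0])])
assert np.allclose(A @ v, manual)
print(A @ v)  # [-2. -2.]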

Example

Let’s consider the linear transformation from R^{2} \rightarrow R^3 such that:

L(\begin{bmatrix} v_1\\ v_2 \end{bmatrix})= \begin{bmatrix} v_2\\ v_1 - v_2 \\ v_1 + v_2 \end{bmatrix}

Now, we will verify that this is a linear transformation. To do so, we check the two conditions for a linear mapping given above. First, the scalar multiplication condition:

L(c \vec{v}) = c \cdot L(\vec{v})

L(c\begin{bmatrix} v_1\\ v_2 \end{bmatrix})= \begin{bmatrix} c v_2\\ c v_1 - c v_2 \\ c v_1 + c v_2 \end{bmatrix}= c \begin{bmatrix} v_2\\ v_1 - v_2 \\ v_1 + v_2 \end{bmatrix} = c L(\vec{v})

Next, we check the additivity condition:

L(\vec{v} + \vec{w})= L(\vec{v}) + L(\vec{w})

\vec{v} =\begin{bmatrix} v_1 \\ v_2 \end{bmatrix} \vec{w} =\begin{bmatrix} w_1 \\ w_2 \end{bmatrix} \vec{v} + \vec{w} =\begin{bmatrix} v_1 + w_1\\ v_2 + w_2 \end{bmatrix}

L(\vec{v} + \vec{w}) = \begin{bmatrix} v_2 + w_2\\ (v_1 + w_1) - (v_2 + w_2)\\ (v_1 + w_1) + (v_2 + w_2) \end{bmatrix} = \begin{bmatrix} v_2\\ v_1 - v_2\\ v_1 + v_2 \end{bmatrix} + \begin{bmatrix} w_2\\ w_1 - w_2\\ w_1 + w_2 \end{bmatrix} = L(\vec{v}) + L(\vec{w})

This proves that the above transformation is a linear transformation. Examples of transformations that are not linear include trigonometric and polynomial transformations.
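The same two conditions can also be checked numerically. A minimal sketch using random test vectors for the map L defined above:

import numpy as np

def L(v):
    # L(v1, v2) = (v2, v1 - v2, v1 + v2), as defined above.
    v1, v2 = v
    return np.array([v2, v1 - v2, v1 + v2])

rng = np.random.default_rng(1)
v = rng.standard_normal(2)
w = rng.standard_normal(2)
c = 3.7

assert np.allclose(L(c * v), c * L(v))     # scalar multiplication
assert np.allclose(L(v + w), L(v) + L(w))  # additivity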

Kernel/Range Space

Kernel space: 

Let T: V \rightarrow W be a linear transformation. The set of all v \epsilon V such that:

T(v) = 0

is the kernel space of T. It is also known as the null space of T.

  • The kernel space of the zero transformation T: V \rightarrow W is the entire domain V.
  • The kernel space of the identity transformation T: V \rightarrow V is {0}.

The dimension of the kernel space is known as the nullity, or null(T).

Range Space: Let T: V \rightarrow W be a linear transformation. The set of all w \epsilon W such that w = T(v) for some v \epsilon V is the range space of T. The range space is always a non-empty set for a linear transformation because:

T(0) = 0

The dimension of the range space is known as the rank, or rank(T). The sum of the rank and the nullity is the dimension of the domain:

null(T) + rank(T) = dim(V) = n
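A minimal NumPy sketch verifying this relation; the matrix below is arbitrary, constructed to be rank-deficient so that the nullity is non-zero:

import numpy as np

# Arbitrary 3x4 matrix, made rank-deficient: row 3 = row 1 + row 2.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0, 1.0]])

n = A.shape[1]                   # dimension of the domain R^n
rank = np.linalg.matrix_rank(A)  # dimension of the range space
nullity = n - rank               # dimension of the kernel space
print(rank, nullity)             # 2 2
assert rank + nullity == n       # rank-nullity theorem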

Linear Transformation as Rotation

Some transformation operators, when applied to a vector, output the original vector rotated by an angle \theta.

The linear transformation T: R^2 \rightarrow R^2 given by the matrix A = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} has the property that it rotates every vector anticlockwise about the origin by the angle \theta:

Let v = \begin{bmatrix} r\cos\alpha \\ r\sin\alpha \end{bmatrix}

T(v) = A \cdot v = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \cdot \begin{bmatrix} r\cos\alpha \\ r\sin\alpha \end{bmatrix} = \begin{bmatrix} r(\cos\theta\cos\alpha - \sin\theta\sin\alpha) \\ r(\sin\theta\cos\alpha + \cos\theta\sin\alpha) \end{bmatrix} = \begin{bmatrix} r\cos(\theta + \alpha) \\ r\sin(\theta + \alpha) \end{bmatrix}

which is the original vector rotated by the angle \theta.
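A quick sketch of this rotation in NumPy: with \theta = 90°, the unit vector along the x-axis is sent to the y-axis:

import numpy as np

theta = np.pi / 2  # rotate by 90 degrees
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])  # unit vector along the x-axis
print(A @ v)              # ~[0. 1.]: rotated onto the y-axis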

Linear Transformation as Projection

A linear transformation T: R^3 \rightarrow R^3 is given by:

T = \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 0 \end{bmatrix}

If a vector is given by v = (x, y, z), then T \cdot v = (x, y, 0), which is the orthogonal projection of the original vector onto the xy-plane.
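A one-line check of this projection in NumPy:

import numpy as np

T = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])
v = np.array([2.0, -3.0, 5.0])
print(T @ v)  # [ 2. -3.  0.]: the z-component is dropped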

Differentiation as Linear Transformation

Let T: P(F) \rightarrow P(F) be the differentiation transformation, defined by T(p(z)) = p^{'}(z). Then for two polynomials p(z), q(z) \epsilon P(F), we have:

T(p(z) + q(z)) = (p(z) + q(z))^{'} = p^{'}(z) + q^{'}(z) = T(p(z)) + T(q(z))

Similarly, for a scalar a \epsilon F, we have:

T(a\cdot p(z)) = (a \cdot p(z))^{'} = a p^{'}(z) = a T(p(z))

The above equations prove that differentiation is a linear transformation.
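The same check can be run on coefficient vectors, treating NumPy's polyder function as the differentiation operator T (coefficients are listed highest degree first):

import numpy as np

# p(z) = 3z^2 + 2z + 1 and q(z) = z^2 - z + 4 as coefficient vectors.
p = np.array([3.0, 2.0, 1.0])
q = np.array([1.0, -1.0, 4.0])
a = 2.5

# Additivity: (p + q)' = p' + q'
assert np.allclose(np.polyder(p + q), np.polyder(p) + np.polyder(q))
# Homogeneity: (a*p)' = a * p'
assert np.allclose(np.polyder(a * p), a * np.polyder(p))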


