Linear mapping

  • Last Updated : 08 Oct, 2021

Let V and W be vector spaces over a field K. A function f: V \rightarrow W is said to be a linear map if, for any two vectors u, v \in V and any scalar c \in K:

  • The transformation is additive:

f(u + v) = f(u) + f(v)

  • The transformation is homogeneous with respect to scalar multiplication:

f(cu) = c \cdot f(u)

Zero/Identity Transformation

A linear transformation T: V \rightarrow V          from a vector space into itself is called a linear operator. Two special cases are:

  • Zero transformation: A transformation T: V \rightarrow W          is called the zero transformation if:

T(v) = 0 \, \forall \, v \in V

  • Identity transformation: A transformation T: V \rightarrow V          is called the identity transformation if:

T(v) = v \, \forall \, v \in V

Properties of Linear Transformation

Let T: V \rightarrow W be a linear transformation and let u, v \in V. Then the following properties hold:

  • T(0) =0
  • T(-v) = - T(v)
  • T(u-v) = T(u) - T(v)
  • If v = c_1 v_1 + c_2 v_2 + ... + c_n v_n, then T(v) = c_1 T(v_1) + c_2 T(v_2) + ... + c_n T(v_n)
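These properties can be checked numerically for any matrix map T(v) = Av. A minimal sketch with NumPy (the matrix A and the vectors below are illustrative values, not taken from the article):

```python
import numpy as np

# An arbitrary linear map T(v) = A @ v from R^3 to R^2
A = np.array([[1.0, 2.0, 0.0],
              [0.0, -1.0, 3.0]])
T = lambda v: A @ v

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.5, 2.0])
c1, c2 = 2.0, -3.0

# T(0) = 0
assert np.allclose(T(np.zeros(3)), np.zeros(2))
# T(-v) = -T(v)
assert np.allclose(T(-v), -T(v))
# T(u - v) = T(u) - T(v)
assert np.allclose(T(u - v), T(u) - T(v))
# T(c1 u + c2 v) = c1 T(u) + c2 T(v)
assert np.allclose(T(c1 * u + c2 * v), c1 * T(u) + c2 * T(v))
```

Each assertion corresponds to one of the bullet points above.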

Linear Transformation as a Matrix

Let A be an m \times n matrix. Then the map T: R^n \rightarrow R^m          defined by

T(v) = Av

is a linear transformation.

Zero and Identity Matrix operations

  • An m \times n zero matrix corresponds to the zero transformation from R^n \rightarrow R^m.
  • The n \times n identity matrix \mathbb{I_n}          corresponds to the identity transformation from R^n \rightarrow R^n.

The action of A on a vector v \in R^n produces a vector in R^m:

A \cdot v = \begin{bmatrix} a_{11}&  a_{12}&  \cdots& a_{1n} \\ a_{21}&  a_{22}&  \cdots& a_{2n} \\ \vdots&  \vdots&  \ddots& \vdots\\ a_{m1}&  a_{m2}&  \cdots& a_{mn} \end{bmatrix} \cdot \begin{bmatrix} v_1\\ v_2\\ \vdots\\ v_n \end{bmatrix} = \begin{bmatrix} a_{11} v_1 + a_{12} v_2 + \cdots + a_{1n} v_n \\ \vdots\\ a_{m1} v_1 + a_{m2} v_2 + \cdots + a_{mn} v_n \end{bmatrix}
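The matrix-vector product above can be computed directly with NumPy; the sketch below (with illustrative values for A and v) confirms that each entry of A @ v is the dot product of the corresponding row of A with v:

```python
import numpy as np

# T(v) = A @ v: an m x n matrix sends a vector in R^n to one in R^m
A = np.array([[1, 2, 3],
              [4, 5, 6]])        # m = 2, n = 3
v = np.array([1, 0, -1])         # a vector in R^3

# Entry i of A @ v is a_{i1} v_1 + ... + a_{in} v_n
manual = np.array([sum(A[i, j] * v[j] for j in range(3)) for i in range(2)])
assert np.array_equal(A @ v, manual)
print(A @ v)                     # the image of v, a vector in R^2
```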


Let’s consider the linear transformation L: R^{2} \rightarrow R^3 such that:

L(\begin{bmatrix} v_1\\ v_2 \end{bmatrix})= \begin{bmatrix} v_1\\ v_1 - v_2 \\ v_1 + v_2 \end{bmatrix}

Now we verify that it is a linear transformation by checking the two conditions of a linear map. First, the scalar multiplication condition:

L(c \vec{v}) = c \cdot L(\vec{v})

L(c\begin{bmatrix} v_1\\ v_2 \end{bmatrix})= \begin{bmatrix} c v_1\\ c v_1 - c v_2 \\ c v_1 + c v_2 \end{bmatrix}= c \begin{bmatrix} v_1\\ v_1 - v_2 \\ v_1 + v_2 \end{bmatrix} = c L(\vec{v})

and then the additivity condition:

L(\vec{v} + \vec{w})= L(\vec{v}) + L(\vec{w})

\vec{v} =\begin{bmatrix} v_1 \\ v_2 \end{bmatrix} \vec{w} =\begin{bmatrix} w_1 \\ w_2 \end{bmatrix} \vec{v} + \vec{w} =\begin{bmatrix} v_1 + w_1\\ v_2 + w_2 \end{bmatrix}

L(\vec{v} + \vec{w}) = \begin{bmatrix} v_1 + w_1\\ (v_1 + w_1) - (v_2 + w_2)\\ (v_1 + w_1) + (v_2 + w_2) \end{bmatrix} = \begin{bmatrix} v_1\\ v_1 - v_2\\ v_1 + v_2 \end{bmatrix} + \begin{bmatrix} w_1\\ w_1 - w_2\\ w_1 + w_2 \end{bmatrix} = L(\vec{v}) + L(\vec{w})

This proves that the above transformation is a linear transformation. Examples of non-linear transformations include trigonometric transformations and polynomial transformations of degree greater than one.
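The same verification can be done numerically. A sketch (with illustrative test vectors) that checks both conditions for the map L above, and shows that a squaring map fails additivity:

```python
import numpy as np

def L(v):
    """The transformation from the example: (v1, v2) -> (v1, v1 - v2, v1 + v2)."""
    v1, v2 = v
    return np.array([v1, v1 - v2, v1 + v2])

v = np.array([1.0, 2.0])
w = np.array([3.0, -1.0])
c = 2.5

assert np.allclose(L(v + w), L(v) + L(w))   # additivity
assert np.allclose(L(c * v), c * L(v))      # scalar homogeneity

# A polynomial map such as f(v) = v**2 is not linear:
f = lambda v: v ** 2
assert not np.allclose(f(v + w), f(v) + f(w))
```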

Kernel and Range Space

Kernel space: 

Let T: V \rightarrow W be a linear transformation. The set of all vectors v \in V such that:

T(v) = 0

is called the kernel space of T. It is also known as the null space of T.

  • The kernel of the zero transformation T:V \rightarrow W is the whole domain V.
  • The kernel of the identity transformation T:V \rightarrow V is \{0\}.

The dimension of the kernel space is known as the nullity of T, denoted null(T).
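For a matrix map, a basis of the kernel can be read off from the singular value decomposition: the rows of Vᵀ whose singular values are (numerically) zero span the null space. A sketch with an illustrative rank-1 matrix:

```python
import numpy as np

# A has rank 1 (row 2 is twice row 1), so its kernel is one-dimensional
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

u, s, vh = np.linalg.svd(A)
null_mask = s < 1e-10
kernel_basis = vh[null_mask]        # each row is a basis vector of ker(A)

# Every kernel vector is mapped to the zero vector
for k in kernel_basis:
    assert np.allclose(A @ k, 0)
print(kernel_basis)
```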

Range Space:

Let T: V \rightarrow W be a linear transformation. The set of all vectors w \in W for which there exists some v \in V with:

T(v) = w

is the range space of T. The range space is always non-empty for a linear transformation, because:

T(0) = 0

The dimension of the range space is known as the rank of T, denoted rank(T). By the rank–nullity theorem, the sum of the rank and the nullity equals the dimension of the domain:

null(T) + rank(T) = dim(V)=n
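The rank–nullity identity can be checked numerically: compute the rank with `np.linalg.matrix_rank` and the nullity as the count of near-zero singular values. A sketch with an illustrative 3×3 matrix:

```python
import numpy as np

# Illustrative matrix: row 2 is twice row 1, so the rank drops by one
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])
n = A.shape[1]                          # dimension of the domain R^n

rank = np.linalg.matrix_rank(A)

# Nullity = number of (near-)zero singular values
s = np.linalg.svd(A, compute_uv=False)
nullity = int(np.sum(s < 1e-10))

assert rank + nullity == n
print(rank, nullity)                    # 2 and 1 for this matrix
```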

Linear Transformation as Rotation

Some transformation operators, when applied to a vector, return the original vector rotated by an angle \theta.

  • The linear transformation T: R^2 \rightarrow R^2 given by the matrix A= \begin{bmatrix} \cos\theta & -\sin \theta \\ \sin\theta & \cos \theta \end{bmatrix}         rotates every vector anticlockwise about the origin by the angle \theta:

Let v=\begin{bmatrix} r \cos \alpha\\ r \sin \alpha \end{bmatrix}

T(v) = A \cdot v= \begin{bmatrix} \cos\theta & -\sin \theta \\ \sin\theta & \cos \theta \end{bmatrix} \cdot \begin{bmatrix} r \cos \alpha \\ r \sin \alpha \end{bmatrix} = \begin{bmatrix} r (\cos \theta \cos \alpha - \sin \theta \sin \alpha) \\ r (\sin \theta \cos \alpha + \cos \theta \sin \alpha) \end{bmatrix} = \begin{bmatrix} r \cos(\theta + \alpha) \\ r \sin(\theta + \alpha) \end{bmatrix}

which is the original vector rotated by \theta.
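A quick numerical sketch of the rotation matrix: rotating (1, 0) anticlockwise by 90° should give (0, 1), and composing two rotations should add their angles (the angle values below are illustrative):

```python
import numpy as np

def rotation(theta):
    """The 2x2 anticlockwise rotation matrix for angle theta (radians)."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# Rotating (1, 0) by 90 degrees gives (0, 1)
v = np.array([1.0, 0.0])
out = rotation(np.pi / 2) @ v
assert np.allclose(out, [0.0, 1.0])

# Composition of rotations adds the angles: R(a) R(b) = R(a + b)
a, b = 0.3, 1.1
assert np.allclose(rotation(a) @ rotation(b), rotation(a + b))
```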

Linear Transformation as Projection

A linear transformation T: R^3 \rightarrow R^3 is given by:

T = \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 0 \end{bmatrix}

If a vector is given by v = (x, y, z), then T\cdot v  = (x, y, 0), which is the orthogonal projection of the original vector onto the xy-plane.
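The projection above can be sketched directly; note also that a projection is idempotent, so applying it twice changes nothing (the vector below is an illustrative value):

```python
import numpy as np

# Orthogonal projection onto the xy-plane
P = np.array([[1, 0, 0],
              [0, 1, 0],
              [0, 0, 0]])

v = np.array([3, 4, 5])
assert np.array_equal(P @ v, [3, 4, 0])   # the z-component is dropped

# A projection is idempotent: P applied twice equals P
assert np.array_equal(P @ P, P)
```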

Differentiation as Linear Transformation

Let T: P(F) \rightarrow P(F) be the differentiation transformation defined by T(p(z)) = p^{'}(z). Then for two polynomials p(z), q(z) \in P(F), we have:

T(p(z) + q(z)) = (p(z) + q(z))^{'} = p^{'}(z) + q^{'}(z) = T(p(z)) + T(q(z))

Similarly, for a scalar a \in F we have:

T(a\cdot p(z)) = (a \cdot p(z))^{'} = a p^{'}(z) = a T(p(z))

The above equations prove that differentiation is a linear transformation.
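Both identities can be verified on concrete polynomials represented as coefficient arrays, using NumPy's `polyder` for differentiation (the polynomials p, q and the scalar a below are illustrative):

```python
import numpy as np
from numpy.polynomial import polynomial as poly

# Coefficient arrays, lowest degree first:
# p(z) = 1 + 2z + 3z^2,  q(z) = 5 - z
p = np.array([1.0, 2.0, 3.0])
q = np.array([5.0, -1.0, 0.0])
a = 4.0

T = poly.polyder            # T(p) = p'

# Additivity: (p + q)' = p' + q'
assert np.allclose(T(p + q), T(p) + T(q))
# Homogeneity: (a p)' = a p'
assert np.allclose(T(a * p), a * T(p))
```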

