**Trace of a matrix:**

Let A = [a_{ij}] be a square matrix of order n. The sum of its diagonal elements is called the trace of the matrix, denoted by tr(A): tr(A) = a_{11} + a_{22} + a_{33} + ……… + a_{nn}

**Properties of trace of matrix:**

Let A and B be any two square matrices of order n. Then:

- tr(kA) = k tr(A) where k is a scalar.
- tr(A+B) = tr(A)+tr(B)
- tr(A-B) = tr(A)-tr(B)
- tr(AB) = tr(BA)
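The properties above can be checked numerically. Below is a minimal sketch using NumPy; the matrices A and B are arbitrary examples chosen for illustration, not part of the original text.

```python
import numpy as np

# Example 3x3 matrices (chosen arbitrarily for illustration).
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])
B = np.array([[2, 0, 1],
              [1, 3, 0],
              [0, 1, 4]])
k = 5

# tr(A) is the sum of the diagonal elements: 1 + 5 + 9 = 15.
print(np.trace(A))                                    # 15

# tr(kA) = k tr(A)
print(np.trace(k * A) == k * np.trace(A))             # True

# tr(A+B) = tr(A) + tr(B)
print(np.trace(A + B) == np.trace(A) + np.trace(B))   # True

# tr(AB) = tr(BA), even though AB != BA in general.
print(np.trace(A @ B) == np.trace(B @ A))             # True
```

Note that the last property holds even when A and B do not commute, which makes it useful for simplifying trace expressions of matrix products.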

**Solution of a system of linear equations:**

A system of linear equations can have three kinds of solutions:

- No Solution
- Unique Solution
- Infinitely Many Solutions

**Rank of a matrix:** The rank of a matrix is the number of non-zero rows in its row-reduced form; equivalently, it is the maximum number of linearly independent rows, or the maximum number of linearly independent columns.

Let A be any m×n matrix; it has square sub-matrices of different orders. A is said to be of rank r if it satisfies both of the following properties:

- It has at least one square sub-matrix of order r whose determinant is non-zero.
- The determinants of all square sub-matrices of order (r+1) or higher are zero.

Rank is denoted by P(A).

If A is a non-singular matrix of order n, then the rank of A is n, i.e., P(A) = n.

**Properties of rank of a matrix:**

- If A is a null matrix then P(A) = 0, i.e., the rank of a null matrix is zero.
- If I_{n} is the n×n unit matrix then P(I_{n}) = n.
- For any m×n matrix A, P(A) ≤ min(m, n); thus P(A) ≤ m and P(A) ≤ n.
- P(A_{n×n}) = n if |A| ≠ 0, and P(A_{n×n}) < n if |A| = 0.
- If P(A) = m and P(B) = n then P(AB) ≤ min(m, n).
- If A and B are square matrices of order n then P(AB) ≥ P(A) + P(B) − n.
- If A_{m×1} is a non-zero column matrix and B_{1×n} is a non-zero row matrix then P(AB) = 1.
- The rank of a skew-symmetric matrix cannot be equal to one.
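A few of these rank properties can be verified directly with NumPy's `matrix_rank`. The matrices below are illustrative examples, not from the original text; the singular matrix is built so that its third row is the sum of the first two.

```python
import numpy as np

# A singular 3x3 matrix: row 3 = row 1 + row 2, so |A| = 0 and P(A) < 3.
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [5, 7, 9]])
print(np.linalg.matrix_rank(A))              # 2  (less than n = 3)

# Rank of the n×n unit matrix I_n is n.
print(np.linalg.matrix_rank(np.eye(3)))      # 3

# Outer product of a non-zero column (m×1) and a non-zero row (1×n) has rank 1.
col = np.array([[1], [2], [3]])              # 3×1
row = np.array([[4, 5]])                     # 1×2
print(np.linalg.matrix_rank(col @ row))      # 1
```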

**System of homogeneous linear equations AX = 0**.

- X = 0 is always a solution, i.e., all the unknowns equal zero. (This is called the trivial solution.)
- If P(A) = number of unknowns, the system has a unique solution (the trivial one).
- If P(A) < number of unknowns, the system has infinitely many solutions.
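As a sketch of the rank test for AX = 0, the example below (matrix chosen for illustration) has a coefficient matrix whose rank is less than the number of unknowns, so non-trivial solutions exist.

```python
import numpy as np

# AX = 0 where row 2 is a multiple of row 1, so P(A) < number of unknowns.
A = np.array([[1, 2, 3],
              [2, 4, 6],
              [1, 1, 1]])
n_unknowns = A.shape[1]
rank = np.linalg.matrix_rank(A)
print(rank, n_unknowns)          # 2 3

if rank == n_unknowns:
    print("only the trivial solution X = 0")
else:
    print("infinitely many solutions")   # this branch runs here
```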

**System of non-homogenous linear equations AX = B**.

- If P[A:B] ≠ P(A), no solution.
- If P[A:B] = P(A) = number of unknowns, unique solution.
- If P[A:B] = P(A) < number of unknowns, infinitely many solutions.

Here P[A:B] is the rank of the augmented matrix [A:B] of the system AX = B.
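The three cases above can be packaged into a small classifier. This is a minimal sketch; `classify_system` is a hypothetical helper name, and the test systems are examples chosen to hit each branch.

```python
import numpy as np

def classify_system(A, B):
    """Classify AX = B by comparing P(A) with P[A:B] and the unknown count."""
    aug = np.hstack([A, B.reshape(-1, 1)])   # augmented matrix [A:B]
    rA = np.linalg.matrix_rank(A)
    rAB = np.linalg.matrix_rank(aug)
    n = A.shape[1]                            # number of unknowns
    if rAB != rA:
        return "no solution"
    return "unique solution" if rA == n else "infinitely many solutions"

# x + y = 1, 2x + 2y = 3: parallel equations, inconsistent.
print(classify_system(np.array([[1, 1], [2, 2]]), np.array([1, 3])))
# x + y = 2, x - y = 0: full rank, one solution (x = y = 1).
print(classify_system(np.array([[1, 1], [1, -1]]), np.array([2, 0])))
# x + 2y = 3, 2x + 4y = 6: same equation twice, infinitely many solutions.
print(classify_system(np.array([[1, 2], [2, 4]]), np.array([3, 6])))
```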

A system of linear equations is in one of two states:

**Consistent State:** A system of equations having one or more solutions is called a consistent system of equations.

**Inconsistent State:** A system of equations having no solution is called an inconsistent system of equations.

**Linear dependence and linear independence of vectors:**

**Linear Dependence:** A set of vectors X_{1}, X_{2}, …, X_{r} is said to be linearly dependent if there exist r scalars k_{1}, k_{2}, …, k_{r}, not all zero, such that: k_{1}X_{1} + k_{2}X_{2} + …… + k_{r}X_{r} = 0.

**Linear Independence:** A set of vectors X_{1}, X_{2}, …, X_{r} is said to be linearly independent if k_{1}X_{1} + k_{2}X_{2} + …… + k_{r}X_{r} = 0 implies k_{1} = k_{2} = …… = k_{r} = 0.

**How to determine linear dependence or independence?**

Let X_{1}, X_{2}, …, X_{r} be the given vectors. Construct a matrix with these vectors as its rows.

- If the rank of this matrix is less than the number of vectors, the vectors are linearly dependent.
- If the rank of this matrix is equal to the number of vectors, the vectors are linearly independent.
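The rank test above translates directly into code. This is a minimal sketch; `are_independent` is a hypothetical helper name, and the vector sets are illustrative examples.

```python
import numpy as np

def are_independent(vectors):
    """Stack the vectors as rows of a matrix; they are linearly
    independent iff the rank equals the number of vectors."""
    M = np.array(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

# (2, 4, 6) = 2 * (1, 2, 3), so these are linearly dependent.
print(are_independent([[1, 2, 3], [2, 4, 6]]))              # False

# The standard basis vectors of R^3 are linearly independent.
print(are_independent([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))   # True
```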


This article is contributed by **Nitika Bansal**.