Trace of a matrix:
Let A = [aij] be a square matrix of order n. The sum of its diagonal elements is called the trace of the matrix, denoted by tr(A):
tr(A) = a11 + a22 + a33 + ... + ann
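As a quick illustration of the definition, the sketch below computes the trace of a small example matrix with NumPy; the matrix and the use of np.trace are illustrative choices, not part of the definition itself.

```python
import numpy as np

# Example 3 x 3 matrix; its trace is the sum of the diagonal entries.
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

print(np.trace(A))                               # 15, i.e. 1 + 5 + 9
print(sum(A[i, i] for i in range(A.shape[0])))   # same value, summed entry by entry
```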
Properties of trace of matrix:
Let A and B be any two square matrices of order n, then
- tr(kA) = k tr(A) where k is a scalar.
- tr(A+B) = tr(A)+tr(B)
- tr(A-B) = tr(A)-tr(B)
- tr(AB) = tr(BA)
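These identities are easy to spot-check numerically. The snippet below is a minimal sketch that verifies them for one pair of randomly chosen integer matrices; it is an illustration, not a proof.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 6, size=(3, 3))   # random 3 x 3 integer matrices
B = rng.integers(-5, 6, size=(3, 3))
k = 4                                  # an arbitrary scalar

assert np.trace(k * A) == k * np.trace(A)             # tr(kA) = k tr(A)
assert np.trace(A + B) == np.trace(A) + np.trace(B)   # tr(A+B) = tr(A) + tr(B)
assert np.trace(A - B) == np.trace(A) - np.trace(B)   # tr(A-B) = tr(A) - tr(B)
assert np.trace(A @ B) == np.trace(B @ A)             # tr(AB) = tr(BA)
print("All four trace properties hold for this example.")
```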
Solution of a system of linear equations:
A system of linear equations can have three kinds of solutions:
- No Solution
- Unique Solution
- Infinitely Many Solutions
Rank of a matrix: The rank of a matrix is the number of non-zero rows in its row-reduced form; equivalently, it is the maximum number of linearly independent rows, which equals the maximum number of linearly independent columns.
Let A be any m x n matrix; it has square sub-matrices of different orders. A matrix is said to be of rank r if it satisfies the following properties:
- It has at least one square sub-matrix of order r whose determinant is non-zero.
- The determinants of all square sub-matrices of order (r + 1) or higher are zero.
Rank is denoted by P(A).
If A is a non-singular matrix of order n, then the rank of A is n, i.e. P(A) = n.
Properties of rank of a matrix:
- If A is a null matrix then P(A) = 0, i.e. the rank of a null matrix is zero.
- If In is the n x n unit matrix, then P(In) = n.
- For a matrix A of size m x n, P(A) ≤ min(m, n); thus P(A) ≤ m and P(A) ≤ n.
- For an n x n matrix A, P(A) = n if |A| ≠ 0.
- For any two matrices A and B for which the product AB is defined, P(AB) ≤ min(P(A), P(B)).
- If A and B are square matrices of order n, then P(AB) ≥ P(A) + P(B) - n (Sylvester's rank inequality).
- If Am×1 is a non-zero column matrix and B1×n is a non-zero row matrix, then P(AB) = 1.
- The rank of a skew-symmetric matrix cannot be equal to one (the rank of a real skew-symmetric matrix is always even).
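A few of these properties can be checked numerically with np.linalg.matrix_rank, which computes the rank from the singular values. The matrices below are example choices made for this sketch, not taken from the text above.

```python
import numpy as np

A = np.array([[1, 2, 3],
              [2, 4, 6],    # a multiple of the first row, so the rank drops
              [1, 0, 1]])
print(np.linalg.matrix_rank(A))            # 2: only two linearly independent rows

print(np.linalg.matrix_rank(np.eye(3)))    # 3: rank of the n x n unit matrix is n

col = np.array([[1], [2], [3]])            # non-zero 3 x 1 column matrix
row = np.array([[4, 5, 6]])                # non-zero 1 x 3 row matrix
print(np.linalg.matrix_rank(col @ row))    # 1: the outer product has rank 1
```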
System of homogeneous linear equations AX = 0:
- X = 0 is always a solution, i.e. all the unknowns are equal to zero. (This is called the trivial solution.)
- If P(A) = the number of unknowns, the system has a unique solution (only the trivial one).
- If P(A) < the number of unknowns, the system has infinitely many solutions, as the sketch after this list illustrates.
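The sketch below applies this rule by comparing the rank of A with the number of unknowns (the number of columns of A); the helper name classify_homogeneous and the example matrices are invented for illustration.

```python
import numpy as np

def classify_homogeneous(A):
    """Classify the homogeneous system AX = 0 by comparing P(A) with the unknown count."""
    unknowns = A.shape[1]                      # number of unknowns = number of columns
    if np.linalg.matrix_rank(A) == unknowns:
        return "unique solution (only the trivial solution X = 0)"
    return "infinitely many solutions"

# Rank 3 = 3 unknowns -> only the trivial solution.
print(classify_homogeneous(np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]])))

# Rank 2 < 3 unknowns -> infinitely many non-trivial solutions.
print(classify_homogeneous(np.array([[1, 2, 3], [2, 4, 6], [1, 0, 1]])))
```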
System of non-homogeneous linear equations AX = B:
- If P[A:B] ≠ P(A), there is no solution.
- If P[A:B] = P(A) = the number of unknowns, there is a unique solution.
- If P[A:B] = P(A) < the number of unknowns, there are infinitely many solutions.
Here P[A:B] denotes the rank of the augmented matrix [A:B], obtained by appending the column B to A (the matrix used in Gauss elimination); a sketch of this test follows below.
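A minimal sketch of this three-way test follows; the function name classify_system and the example systems are illustrative choices, not part of the standard statement above.

```python
import numpy as np

def classify_system(A, B):
    """Classify AX = B using the ranks of A and of the augmented matrix [A:B]."""
    augmented = np.hstack([A, B.reshape(-1, 1)])     # append B as an extra column
    rank_A = np.linalg.matrix_rank(A)
    rank_aug = np.linalg.matrix_rank(augmented)
    if rank_aug != rank_A:
        return "no solution (inconsistent)"
    if rank_A == A.shape[1]:                         # rank equals the number of unknowns
        return "unique solution"
    return "infinitely many solutions"

A = np.array([[1, 1], [2, 2]])
print(classify_system(A, np.array([3, 7])))                            # no solution
print(classify_system(A, np.array([3, 6])))                            # infinitely many solutions
print(classify_system(np.array([[1, 1], [1, -1]]), np.array([3, 1])))  # unique solution
```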
A system of linear equations can be in one of two states:
- Consistent State: A system of equations having one or more solutions is called a consistent system of equations.
- Inconsistent State: A system of equations having no solution is called an inconsistent system of equations.
Linear dependence and linear independence of vectors:
Linear Dependence: A set of vectors X1, X2, ..., Xr is said to be linearly dependent if there exist r scalars k1, k2, ..., kr, not all zero, such that k1X1 + k2X2 + ... + krXr = 0.
Linear Independence: A set of vectors X1, X2, ..., Xr is said to be linearly independent if the only scalars k1, k2, ..., kr satisfying k1X1 + k2X2 + ... + krXr = 0 are k1 = k2 = ... = kr = 0.
How to determine linear dependence or independence?
Let X1, X2, ..., Xr be the given vectors. Construct a matrix with the given vectors as its rows.
- If the rank of this matrix is less than the number of vectors, the vectors are linearly dependent.
- If the rank of this matrix is equal to the number of vectors, the vectors are linearly independent. A sketch of this check follows the list.
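Following this recipe, the sketch below stacks the given vectors as the rows of a matrix and compares its rank with the number of vectors; the vectors X1, X2, X3 are example data chosen here.

```python
import numpy as np

def are_independent(vectors):
    """Vectors are linearly independent iff the rank of the stacked matrix equals their count."""
    M = np.vstack(vectors)                           # one vector per row
    return np.linalg.matrix_rank(M) == len(vectors)

X1, X2, X3 = np.array([1, 2, 3]), np.array([0, 1, 1]), np.array([1, 3, 4])
print(are_independent([X1, X2]))      # True: neither vector is a multiple of the other
print(are_independent([X1, X2, X3]))  # False: X3 = X1 + X2, so the set is dependent
```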