Matrix
The matrix of a linear map $T \in \mathcal{L}(V, W)$ is defined with respect to a set of bases for $V$ and $W$. The entries of $\mathcal{M}(T)$ are defined as
$$T v_k = A_{1,k} w_1 + \cdots + A_{m,k} w_m$$
where $v_1, \dots, v_n$ and $w_1, \dots, w_m$ are the basis vectors of $V$ and $W$ respectively. $\mathcal{M}(T)$ is an $m$-by-$n$ matrix and each element $A_{j,k} \in \mathbb{F}$. Unless stated otherwise, we will use the standard basis.
It is important to note that the matrix of a linear map is always defined with respect to a set of basis vectors.
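As a quick illustration, here is a minimal numpy sketch that builds the matrix of a map column by column by applying it to the standard basis vectors of the domain. The map $T(x, y) = (2x + y,\; x,\; 3y)$ is made up for this example:

```python
import numpy as np

# Hypothetical linear map T: R^2 -> R^3, T(x, y) = (2x + y, x, 3y)
def T(v):
    x, y = v
    return np.array([2 * x + y, x, 3 * y])

# Column k of M(T) holds the coordinates of T applied to the
# k-th standard basis vector of the domain.
basis = np.eye(2)
M = np.column_stack([T(e) for e in basis])
print(M)   # 3-by-2 matrix of T in the standard bases:
# [[2. 1.]
#  [1. 0.]
#  [0. 3.]]
```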
Addition and Multiplication
Addition of matrices of the same size is defined as the matrix whose every element is the sum of the corresponding elements of the two matrices:
$$(A + B)_{j,k} = A_{j,k} + B_{j,k}$$
Thus, the matrix of a sum of linear maps is the sum of the matrices of those linear maps:
$$\mathcal{M}(S + T) = \mathcal{M}(S) + \mathcal{M}(T)$$
Scalar multiplication by $\lambda$ is the same as multiplying each element of the original matrix by $\lambda$:
$$(\lambda A)_{j,k} = \lambda A_{j,k}$$
Thus, the matrix of a scalar times a linear map is the same as the scalar times the matrix of the linear map:
$$\mathcal{M}(\lambda T) = \lambda \mathcal{M}(T)$$
We will denote the set of all $m$-by-$n$ matrices with elements in $\mathbb{F}$ by $\mathbb{F}^{m,n}$. This set of all matrices is also a vector space under the addition and scalar multiplication rules for matrices defined above. The basis for such a space is the collection of all matrices with all but one element zero, and one element 1. There are $mn$ such matrices, meaning $\dim \mathbb{F}^{m,n} = mn$.
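A small numpy sketch of these rules, using a made-up pair of $2 \times 2$ matrices; the last part reconstructs $A$ from the four single-entry basis matrices to illustrate $\dim \mathbb{F}^{2,2} = 4$:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

print(A + B)   # element-wise sum
print(3 * A)   # every element multiplied by the scalar 3

# Basis of F^{2,2}: the 4 matrices with a single 1 and zeros elsewhere.
basis = [np.zeros((2, 2)) for _ in range(4)]
for i, E in enumerate(basis):
    E[i // 2, i % 2] = 1.0

# Any 2x2 matrix is a linear combination of these 4 basis matrices,
# so dim F^{2,2} = 2 * 2 = 4.
recon = sum(A[i // 2, i % 2] * E for i, E in enumerate(basis))
assert np.allclose(recon, A)
```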
Matrix Multiplication
Matrix multiplication is important to define in order to work with the product (composition) of linear maps. We wish to have the following hold
$$\mathcal{M}(ST) = \mathcal{M}(S)\,\mathcal{M}(T)$$
For two matrices $A$ and $B$ of sizes $m$-by-$n$ and $n$-by-$p$ respectively, their matrix multiplication is defined as
$$(AB)_{j,k} = \sum_{r=1}^{n} A_{j,r} B_{r,k}$$
which is the sum-product of the $j$th row of $A$ and the $k$th column of $B$. Notice that the matrices share one dimension, $n$. This is a necessary condition for the multiplication to be valid, and it mirrors the requirement that the range of $T$ lie in the domain of $S$ when forming $ST$. Furthermore, $AB$ is of size $m$-by-$p$.
Matrix multiplication is not commutative ($BA$ may not even be defined, and even when both products exist they need not be equal), but it is distributive and associative.
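A quick numpy check of these properties, using arbitrary small matrices chosen for this sketch:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])
C = np.array([[2, 0], [0, 2]])

print(A @ B)   # [[2 1], [4 3]]
print(B @ A)   # [[3 4], [1 2]]  -- AB != BA in general
assert not np.array_equal(A @ B, B @ A)

# Associativity and distributivity do hold:
assert np.array_equal((A @ B) @ C, A @ (B @ C))
assert np.array_equal(A @ (B + C), A @ B + A @ C)
```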
Inverse Linear Map
A linear map $T \in \mathcal{L}(V, W)$ is invertible if there exists $S \in \mathcal{L}(W, V)$ such that $ST$ is the identity map on $V$ and $TS$ is the identity map on $W$. $S$ is said to be the inverse of $T$, with $ST = I$ and $TS = I$, and the inverse of a linear map, when it exists, is unique (can be proven by contradiction).
We denote the unique inverse of a linear map $T$ by $T^{-1}$, and they satisfy $T^{-1}T = I$ and $TT^{-1} = I$.
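A minimal numpy sketch of these identities, using a made-up invertible $2 \times 2$ matrix:

```python
import numpy as np

T = np.array([[2.0, 1.0], [1.0, 1.0]])   # an invertible operator on R^2
T_inv = np.linalg.inv(T)

# Both products recover the identity map.
assert np.allclose(T_inv @ T, np.eye(2))
assert np.allclose(T @ T_inv, np.eye(2))
```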
A linear map is invertible if and only if it is both injective and surjective. To prove this, we first assume the linear map is invertible and show that it is injective and surjective, and then prove the converse.
Two vector spaces are isomorphic if there is an isomorphism (an invertible linear map) from one vector space onto the other.
Operator
A linear map from a vector space to itself is called an operator. $\mathcal{L}(V)$ denotes the set of all operators on $V$; it is the same as the notation $\mathcal{L}(V, V)$.
On a finite-dimensional vector space $V$, an operator $T \in \mathcal{L}(V)$ is \textbf{invertible} if and only if it is \textbf{injective}, if and only if it is \textbf{surjective}; the three conditions are equivalent.
Rank of a Matrix
Suppose $A \in \mathbb{F}^{m,n}$. Row and column ranks are two non-negative integers defined as
- Row Rank: Dimension of the span of the rows of $A$ in $\mathbb{F}^{1,n}$
- Column Rank: Dimension of the span of the columns of $A$ in $\mathbb{F}^{m,1}$
But for any $A$, row rank $=$ column rank (can be proved using duality). Hence, we simply use the word rank to denote the column rank of a matrix.
Rank is defined for a matrix, whereas dimension is defined for a vector space; this is the important distinction between the two. The rank can be found by counting the number of non-zero rows in the row echelon form of the matrix.
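A short numpy check (with a matrix constructed so that its third row is dependent on the first two) that also confirms row rank equals column rank:

```python
import numpy as np

# 3x4 matrix whose third row is the sum of the first two,
# so only 2 rows (and 2 columns) are linearly independent.
A = np.array([[1, 2, 0, 1],
              [0, 1, 1, 0],
              [1, 3, 1, 1]])

print(np.linalg.matrix_rank(A))    # 2
print(np.linalg.matrix_rank(A.T))  # 2 -- row rank equals column rank
```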
Rank-Nullity Theorem
We restate the fundamental theorem of linear maps using the concepts of dimension and rank. For any vector spaces $V$ and $W$, and linear map $T \in \mathcal{L}(V, W)$,
$$\dim V = \dim \operatorname{null} T + \dim \operatorname{range} T$$
where $\dim V$ is the number of basis vectors of the vector space $V$. In matrix terms, $\dim V$ is the number of columns in the matrix $\mathcal{M}(T)$, and $\dim \operatorname{range} T$ is the rank. For a matrix $A$, the dimension of the null space is the number of basis vectors of the solution set of $Ax = 0$.
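A numerical sanity check of the theorem, reusing the rank-2 matrix from above and estimating the nullity from the singular values (zero singular values correspond to null-space directions):

```python
import numpy as np

A = np.array([[1, 2, 0, 1],
              [0, 1, 1, 0],
              [1, 3, 1, 1]])            # 3x4, rank 2

n = A.shape[1]                          # dim V = number of columns
rank = np.linalg.matrix_rank(A)         # dim range T
# Nullity = n minus the number of (numerically) non-zero singular values.
s = np.linalg.svd(A)[1]
nullity = n - np.count_nonzero(s > 1e-10)

assert rank + nullity == n              # dim V = dim null T + dim range T
print(rank, nullity, n)                 # 2 2 4
```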
Block Matrix
Sometimes, block matrices can help simplify cumbersome operations. For the purpose of demonstration, consider a square matrix $M$ of dimensions $n \times n$, which we break into four blocks: $A$ of size $p \times p$, $B$ of size $p \times q$, $C$ of size $q \times p$, and $D$ of size $q \times q$, with $p + q = n$. Then,
$$M = \begin{bmatrix} A & B \\ C & D \end{bmatrix}$$
Then, the calculation of $M^2$ is
$$M^2 = \begin{bmatrix} A^2 + BC & AB + BD \\ CA + DC & CB + D^2 \end{bmatrix}$$
We obtained $M^2$ by multiplying the four blocks as if they were scalar entries. All the multiplications agree in terms of dimensions. This technique is especially useful when the matrix can be broken up in such a way that the individual blocks are identity matrices, diagonal matrices, etc.
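A numpy sketch verifying the block formula for $M^2$ on randomly generated blocks (the sizes are arbitrary choices for this example):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))
B = rng.standard_normal((2, 3))
C = rng.standard_normal((3, 2))
D = rng.standard_normal((3, 3))

M = np.block([[A, B], [C, D]])          # 5x5 block matrix

# Block formula for M^2: every product below is dimension-compatible.
M2_blocks = np.block([[A @ A + B @ C, A @ B + B @ D],
                      [C @ A + D @ C, C @ B + D @ D]])
assert np.allclose(M @ M, M2_blocks)
```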
Orthogonal Matrix
An orthogonal or \textbf{orthonormal} matrix is a square matrix $Q$ whose rows and columns are orthonormal vectors. An orthogonal matrix will satisfy
$$Q^T Q = I, \qquad Q Q^T = I, \qquad Q^{-1} = Q^T$$
The last one follows from the fact that $Q^T Q = I$ and the inverse of a matrix is unique.
Orthogonal matrices play an important role in the QR decomposition and the SVD.
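For instance, the $Q$ factor from numpy's QR decomposition satisfies all three identities (a random square matrix is used here just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

Q, R = np.linalg.qr(A)                     # Q has orthonormal columns

assert np.allclose(Q.T @ Q, np.eye(4))     # Q^T Q = I
assert np.allclose(Q @ Q.T, np.eye(4))     # Q Q^T = I (Q is square)
assert np.allclose(np.linalg.inv(Q), Q.T)  # Q^{-1} = Q^T
```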
Determinant
The determinant is defined for square matrices and represents a transformation from $\mathbb{F}^{n,n}$ to $\mathbb{F}$. It is defined as follows for $2 \times 2$ and $3 \times 3$ matrices
$$\det \begin{bmatrix} a & b \\ c & d \end{bmatrix} = ad - bc$$
$$\det \begin{bmatrix} a & b & c \\ d & e & f \\ g & h & i \end{bmatrix} = a(ei - fh) - b(di - fg) + c(dh - eg)$$
The term outside the brackets is an entry from the first row, and the term inside the brackets is the determinant of the matrix formed by deleting the first row and the column containing the entry outside the brackets. We put alternating $+$ and $-$ signs before each term. This way, the definition (cofactor expansion along the first row) extends to any $n \times n$ matrix.
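The expansion translates directly into a short recursive function; this sketch implements it and compares against numpy's determinant on an example matrix:

```python
import numpy as np

def det(A):
    """Determinant via cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0.0
    for k in range(n):
        # Minor: delete the first row and column k.
        minor = [row[:k] + row[k + 1:] for row in A[1:]]
        # Alternating sign, times the first-row entry, times its minor.
        total += (-1) ** k * A[0][k] * det(minor)
    return total

A = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0],
     [7.0, 8.0, 10.0]]
print(det(A))                      # -3.0
print(np.linalg.det(np.array(A)))  # matches, up to floating point
```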
If the determinant of a matrix is non-zero, then the matrix has full rank (all its rows and columns are linearly independent) and the matrix is invertible.