A square matrix in which all elements are zero except the diagonal elements, which are all 1.
The identity matrix of size $n \times n$ is denoted by $I_{n}$. \begin{align} I_{3} = \begin{bmatrix} 1 &0 &0\newline 0 &1 &0\newline 0 &0 &1 \end{bmatrix} \end{align}
Any matrix with all elements 0 is called a null matrix (it need not be square). A square null matrix of size $n$ is denoted by $O_{n}$.
A square matrix in which all the elements below the main diagonal are 0, i.e., $a_{ij} = 0 : \forall i > j$.
A square matrix in which all the elements above the main diagonal are 0, i.e., $a_{ij} = 0 : \forall i < j$.
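The two triangular forms can be sketched with NumPy's `triu` and `tril`; the matrix below is an illustrative example, not from the text.

```python
import numpy as np

# Illustrative 3x3 matrix; triu/tril zero out one side of the main diagonal.
A = np.arange(1.0, 10.0).reshape(3, 3)
U = np.triu(A)   # upper triangular: entries with i > j are zeroed
L = np.tril(A)   # lower triangular: entries with i < j are zeroed
print(U)
print(L)
```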
Any square matrix which satisfies $A^{2} = A$ is called idempotent. The null matrix and the identity matrix are examples of such matrices.
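Projection matrices also satisfy this property; a minimal sketch with an illustrative projection onto the first coordinate axis:

```python
import numpy as np

# Illustrative projection onto the x-axis; applying it twice changes nothing.
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])
print(np.allclose(A @ A, A))   # True: A is idempotent
```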
A square matrix which satisfies $A^{2} = I$ is called involutory.
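A row-swap permutation matrix is a simple instance: swapping twice restores the original order. The matrix below is an illustrative example.

```python
import numpy as np

# Illustrative 2x2 row-swap permutation; it is its own inverse.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(np.allclose(A @ A, np.eye(2)))   # True: A^2 = I
```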
A square matrix $A$ is said to be nilpotent of index $x$ if $x$ is the smallest positive integer such that $A^{x} = O$, i.e., $A^{x} = O$ and $A^{x-1} \neq O$.
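Strictly triangular matrices are the standard examples; a minimal sketch with an illustrative $2 \times 2$ case:

```python
import numpy as np

# Illustrative strictly upper triangular matrix: nonzero, but A^2 = O.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
print(np.allclose(A @ A, 0))   # True, while A itself is not O,
                               # so A is nilpotent of index 2
```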
A square matrix is singular if its determinant is 0. Equivalently, a square matrix is non-singular if its determinant is non-zero. A singular matrix is not invertible.
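A quick numerical check, with an illustrative matrix whose rows are linearly dependent:

```python
import numpy as np

# Illustrative singular matrix: the second row is twice the first.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.isclose(np.linalg.det(A), 0.0))   # True: det(A) = 0, so A is singular
# np.linalg.inv(A) would raise numpy.linalg.LinAlgError here
```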
A square matrix is symmetric if $a_{ij} = a_{ji} \quad \forall i,j$. Or simply, if $A^{T} = A$.
For any matrix $A$, the products $AA^{T}$ and $A^{T}A$ are symmetric.
A symmetric matrix is positive definite ($A > 0$) if all its eigenvalues are positive. Equivalently, for such a matrix $x^{T}Ax > 0$ for all non-zero vectors $x$.
It is positive semi-definite ($A \geq 0$) if all its eigenvalues are non-negative. Equivalently, for such a matrix $x^{T}Ax \geq 0$ for all vectors $x$.
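Both characterizations can be checked numerically; the matrices and the test vector below are illustrative examples, not from the text.

```python
import numpy as np

# Illustrative symmetric matrices: one positive definite, one semi-definite.
A_pd  = np.array([[2.0, 1.0], [1.0, 2.0]])   # eigenvalues 1 and 3: all positive
A_psd = np.array([[1.0, 1.0], [1.0, 1.0]])   # eigenvalues 0 and 2: non-negative
for A in (A_pd, A_psd):
    print(np.linalg.eigvalsh(A))   # eigenvalues of a symmetric matrix, ascending

# The quadratic form distinguishes the two cases for this particular x.
x = np.array([1.0, -1.0])
print(x @ A_pd @ x)    # 2.0 > 0
print(x @ A_psd @ x)   # 0.0: semi-definiteness allows equality
```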
If $A$ is positive definite (semi-definite), then there exists $A^{1/2}$ that is also positive definite (semi-definite) such that $A^{1/2} A^{1/2} = A$. This follows from the spectral decomposition $A = Q \Lambda Q^{T}$, where $Q$ is orthogonal (the eigenvectors of a symmetric matrix can be chosen orthonormal): taking $A^{1/2} = Q \Lambda^{1/2} Q^{T}$ gives $A^{1/2} A^{1/2} = Q \Lambda Q^{T} = A$.
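A minimal sketch of constructing $A^{1/2}$ from the eigendecomposition, using an illustrative positive definite matrix:

```python
import numpy as np

# Illustrative symmetric positive definite matrix (eigenvalues 1 and 3).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, Q = np.linalg.eigh(A)                    # A = Q diag(lam) Q^T, lam >= 0
A_half = Q @ np.diag(np.sqrt(lam)) @ Q.T      # square root via sqrt of eigenvalues
print(np.allclose(A_half @ A_half, A))        # True
```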
A square matrix is skew symmetric if $a_{ij} = -a_{ji} : \forall i,j$. This is equivalent to saying $A^{T} = -A$; in particular, the diagonal entries are all 0.
Any square matrix can be expressed as the sum of a symmetric and a skew symmetric matrix: \begin{align} A = \frac{A + A^{T}}{2} + \frac{A - A^{T}}{2} \end{align}
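The decomposition above can be verified directly; the matrix below is an illustrative example.

```python
import numpy as np

# Illustrative square matrix split into symmetric and skew-symmetric parts.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
S = (A + A.T) / 2   # symmetric part:       S^T =  S
K = (A - A.T) / 2   # skew-symmetric part:  K^T = -K
print(np.allclose(S, S.T), np.allclose(K, -K.T), np.allclose(S + K, A))
```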
An orthogonal (or orthonormal) matrix is a square matrix whose rows and columns are orthonormal vectors. An orthogonal matrix $Q$ satisfies \begin{align} QQ^{T} &= Q^{T}Q = I\newline Q^{-1} &= Q^{T}\newline det(Q) &= \pm 1 \end{align}
The last one follows from the fact that \begin{align} 1 = det(I) = det(QQ^{T}) = det(Q)^{2} \end{align}
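These properties can be checked on a rotation matrix, a standard example of an orthogonal matrix; the angle below is chosen arbitrarily.

```python
import numpy as np

# Illustrative 2x2 rotation matrix; rotations are orthogonal.
t = 0.3
Q = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
print(np.allclose(Q @ Q.T, np.eye(2)))     # True: Q Q^T = I
print(np.allclose(np.linalg.inv(Q), Q.T))  # True: Q^{-1} = Q^T
print(np.isclose(np.linalg.det(Q), 1.0))   # True: det is +1 for a rotation
```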