Finite Dimensional Vector Spaces

Linear Combination

A linear combination of vectors in a vector space $\setv$ is a vector of the form \begin{align} a_{1}v_{1} + a_{2}v_{2} + \cdots + a_{n}v_{n}\end{align} where $v_{j} \in \setv$ and $a_{j} \in \field$.
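As a small numerical sketch (using numpy, with example vectors and scalars chosen here purely for illustration), a linear combination is just scaled vectors added together:

```python
import numpy as np

# A linear combination a1*v1 + a2*v2 of two vectors in R^3.
# The vectors and scalars below are illustrative, not from the text.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, -1.0])
a1, a2 = 3.0, -2.0

w = a1 * v1 + a2 * v2
print(w)  # [ 3. -2.  8.]
```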

Span

The span of a list of vectors $v_{1}, \ldots, v_{n} \in \setv$ is the set of all linear combinations of these vectors \begin{align} span(v_{1}, \ldots, v_{n}) = \{ a_{1}v_{1} + \ldots + a_{n}v_{n} \mid a_{1}, \ldots, a_{n} \in \field \}\end{align} The span of the empty list is defined to be the singleton set $\{ 0 \}$.

The span of a list of vectors is the smallest subspace of $\setv$ containing all the vectors in the list.

If $span(v_{1}, \ldots, v_{n}) = \setv$, we say that $v_{1}, \ldots, v_{n}$ spans $\setv$.
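One way to sketch a span-membership test numerically (the vectors below are an illustrative choice): put $v_{1}, \ldots, v_{n}$ as the columns of a matrix; a vector $w$ lies in their span exactly when appending $w$ as an extra column does not increase the rank.

```python
import numpy as np

# Columns of V are v1 and v2; w is in span(v1, v2) iff appending it
# as a column leaves the matrix rank unchanged.
V = np.column_stack([[1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])
w_in = np.array([2.0, 3.0, 5.0])    # equals 2*v1 + 3*v2, so inside the span
w_out = np.array([0.0, 0.0, 1.0])   # not a combination of v1 and v2

def in_span(V, w):
    return np.linalg.matrix_rank(np.column_stack([V, w])) == np.linalg.matrix_rank(V)

print(in_span(V, w_in))   # True
print(in_span(V, w_out))  # False
```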

Finite Dimensional Space

A vector space $\setv$ is called finite dimensional if some list of vectors in $\setv$ spans the space. Any space which is not finite dimensional is called infinite dimensional.

Linearly Independent

  • Let $v_{1}, \ldots, v_{n}$ be a list of vectors. This list is said to be linearly independent if $a_{1}v_{1} + \ldots + a_{n}v_{n} = 0$ for $a_{1}, \ldots, a_{n} \in \field$ only when $a_{1} = \ldots = a_{n} = 0$. The empty list is also declared to be linearly independent.

  • If some vectors are removed from a linearly independent list, the remaining vectors are still linearly independent. A list of vectors that is not linearly independent is called linearly dependent. Any list that contains a linearly dependent sublist is itself linearly dependent.

  • The length of every linearly independent list of vectors is $\leq$ the length of every list of vectors that spans the vector space.

  • In a linearly dependent list of vectors, at least one vector can be expressed as a linear combination of the other vectors in the list. The converse also holds, which can be used to establish whether a list of vectors is linearly independent or dependent.

  • If we express a vector as a linear combination of a linearly independent list of vectors, the coefficients used are unique. Assume two representations of the same vector exist \begin{align} v &= \alpha_{1}v_{1} + \cdots + \alpha_{m}v_{m}\newline v &= \gamma_{1}v_{1} + \cdots + \gamma_{m}v_{m}\newline \Rightarrow 0 &= (\alpha_{1} - \gamma_{1})v_{1} + \cdots + (\alpha_{m} - \gamma_{m})v_{m}\newline \Rightarrow (\alpha_{1} - \gamma_{1}) &= \cdots = (\alpha_{m} - \gamma_{m}) = 0 \end{align} since the only way to obtain the $0$ vector from a linearly independent list is to make all the coefficients $0$, so $\alpha_{j} = \gamma_{j}$ for every $j$.
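The checks above can be sketched numerically (the vectors below are an illustrative choice): with $v_{1}, \ldots, v_{n}$ as the columns of a matrix $A$, the list is linearly independent exactly when $A$ has full column rank, and the unique coefficients representing a vector in the span come from solving a linear system.

```python
import numpy as np

# Columns of A are v1, v2, v3; independence <=> rank(A) = number of columns,
# i.e. a1*v1 + a2*v2 + a3*v3 = 0 forces a1 = a2 = a3 = 0.
A = np.column_stack([[1.0, 0.0, 0.0], [1.0, 1.0, 0.0], [1.0, 1.0, 1.0]])
assert np.linalg.matrix_rank(A) == A.shape[1]  # columns are independent

# Because the columns are independent, the coefficients expressing v
# in terms of them are unique and recoverable by solving A @ coeffs = v.
v = np.array([3.0, 2.0, 1.0])
coeffs = np.linalg.solve(A, v)
print(coeffs)  # [1. 1. 1.]
```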

Basis

A basis of $\setv$ is a list of vectors that is linearly independent and spans $\setv$. For $\field^{n}$, the list of $n$ vectors in which the $j^{th}$ vector has a $1$ in position $j$ and $0$s everywhere else is called the standard basis.

For the vector space of polynomials of degree at most $n$, the standard basis is usually taken to be $1, x, x^{2}, \ldots, x^{n}$ unless stated otherwise.

For a list of vectors $v_{1}, \ldots, v_{n}$ to be a basis of a vector space $\setv$, every element $v \in \setv$ must be expressible as a linear combination $v = a_{1}v_{1} + \ldots + a_{n}v_{n}$ where $a_{1}, \ldots, a_{n} \in \field$ are unique. That is, no other list of coefficients yields the vector $v$ as a linear combination of the vectors in the list.

This follows from the fact that for a linearly independent list of vectors, $0$ is only expressible by choosing all $a_{j} = 0$ in the previous equation. If we assume there are two combinations yielding $v$, subtracting them yields $0$, which forces the two lists of coefficients to be equal, so the coefficients are indeed unique.

Every spanning list of a vector space $\setv$ can be reduced to a basis by removing the vectors that are linear combinations of the others. Any two bases of a vector space have the same length.
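The reduction above can be sketched as a greedy pass over the list (the vectors below are illustrative): keep a vector only if it increases the rank of the matrix of vectors kept so far, i.e. if it is not a linear combination of them.

```python
import numpy as np

def reduce_to_basis(vectors):
    """Keep each vector only if it is independent of those already kept."""
    kept = []
    for v in vectors:
        # Appending v must raise the rank to len(kept) + 1, otherwise
        # v is a linear combination of the kept vectors and is dropped.
        if np.linalg.matrix_rank(np.column_stack(kept + [v])) == len(kept) + 1:
            kept.append(v)
    return kept

# Spanning list of R^2 with one redundant vector ([2, 0] = 2 * [1, 0]).
spanning = [np.array([1.0, 0.0]), np.array([2.0, 0.0]), np.array([0.0, 1.0])]
basis = reduce_to_basis(spanning)
print(len(basis))  # 2 -- the dependent vector [2, 0] was dropped
```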

Dimension

The dimension of a finite dimensional vector space is the length of any basis of the space. It is denoted by dim $\setv$. For instance, dim $\field^{n} = n$ since the standard basis has length $n$.

It follows that the dimension of a subspace of a finite dimensional vector space $\setv$ is $\leq$ dim $\setv$. For example, any line through the origin is a subspace of $\real^{2}$ of dimension 1, since every point on the line is a scalar multiple of a single direction vector.
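The line example can be checked numerically (the direction vector below is an illustrative choice): the dimension of the span of a single nonzero vector is the rank of the matrix having that vector as its only column.

```python
import numpy as np

# A line through the origin in R^2: the span of one nonzero direction
# vector (here the line y = x/2, chosen for illustration). Its dimension
# is the rank of the 2x1 matrix whose column is that vector.
direction = np.array([[2.0], [1.0]])
print(np.linalg.matrix_rank(direction))  # 1
```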

The dimension of a sum of two subspaces can be written as \begin{align} dim(U_{1} + U_{2}) = dim(U_{1}) + dim(U_{2}) - dim(U_{1} \cap U_{2})\end{align} analogous to how the size of the union of two sets is determined. Unlike with sets, however, the analogous inclusion-exclusion formula does not hold in general for three or more subspaces.
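The two-subspace formula can be checked on a concrete example (the subspaces below are an illustrative choice): take $U_{1}$ as the $xy$-plane and $U_{2}$ as the $yz$-plane in $\real^{3}$; their sum is all of $\real^{3}$ and their intersection is the $y$-axis.

```python
import numpy as np

# U1 = span(e1, e2), the xy-plane; U2 = span(e2, e3), the yz-plane.
U1 = np.column_stack([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
U2 = np.column_stack([[0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])

dim_U1 = np.linalg.matrix_rank(U1)                          # 2
dim_U2 = np.linalg.matrix_rank(U2)                          # 2
dim_sum = np.linalg.matrix_rank(np.column_stack([U1, U2]))  # 3, all of R^3
dim_cap = 1  # the y-axis, known from the geometry of the two planes

print(dim_sum == dim_U1 + dim_U2 - dim_cap)  # True
```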