Linear Algebra: Linear Combinations
Bases
A basis is a set of linearly independent vectors that spans a vector space. The plural of basis is bases. Bases are useful because they allow every vector in the vector space to be expressed as a unique linear combination of basis vectors. In contrast, a set of vectors that is merely linearly independent may not span the entire vector space, and a set of vectors that merely spans the vector space may admit multiple, rather than unique, linear combinations for any particular vector in the vector space.
Problems
Show that $\{[0, 1], [1, 1]\}$ form a basis of $\mathbb{R}^2$.
We must show that $\{[0, 1] , [1, 1] \}$ span $\mathbb{R}^2$ and are linearly independent.
First, let $[a, b] \in \mathbb{R}^2$. Then
$ [a, b]= c_1[0, 1] + c_2 [1, 1] \\ [a, b] = [0, c_1] + [c_2, c_2] \\ [a, b] = [c_2, c_1 + c_2] \\ $
From here we have two equations for each vector component:
$ a = c_2 \\ b = c_1 + c_2 \\ $
Solving for $c_1$ gives the following:
$ b = c_1 + a \\ c_1 = b - a \\ $
Thus any vector $[a, b] \in \mathbb{R}^2$ can be expressed as $(b-a)[0, 1] + a[1, 1]$. Therefore $\text{span}\left(\{[0, 1], [1, 1] \}\right) = \mathbb{R}^2$.
Next, we show that neither vector is a scalar multiple of the other. Proceed by contradiction. Assume there is an element $c \in \mathbb{R}$ such that $[0, 1] = c [1, 1]$. Then
$ [0, 1] = c [1, 1] \\ [0, 1] = [c, c] \\ 0 = c \\ 1 = c \\ $
Clearly $1 \neq 0$, so the two vectors are linearly independent and thus form a basis.
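The result can be spot-checked numerically. The following sketch (NumPy is my own choice, not part of the text) uses the fact that two vectors form a basis of $\mathbb{R}^2$ exactly when the matrix with them as columns is invertible, and verifies the derived coefficients $c_1 = b - a$, $c_2 = a$ for one example vector.

```python
import numpy as np

# Columns are the candidate basis vectors [0, 1] and [1, 1].
B = np.column_stack([[0, 1], [1, 1]])
print(np.linalg.det(B))   # nonzero (about -1), so the vectors form a basis

# Spot-check the derived coefficients for [a, b] = [3, 5]:
a, b = 3, 5
v = (b - a) * np.array([0, 1]) + a * np.array([1, 1])
print(v)   # [3 5]
```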
Determine whether $\{[1,3], [2,6]\}$ form a basis of $\mathbb{R}^2$.
The vectors in a basis must be linearly independent. However, $[2,6] = 2[1,3]$. Therefore the two vectors are not linearly independent, so they do not form a basis. Additionally, the vectors do not span $\mathbb{R}^2$, as there is no combination of the two that can produce any vector that is not a scalar multiple of $[1,3]$, such as $[1,2]$. So in fact these two vectors fail both conditions of a basis.
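The dependence can also be confirmed numerically; in this sketch (again assuming NumPy, which the text does not use), the matrix whose columns are $[1,3]$ and $[2,6]$ has rank 1 rather than 2, so the vectors neither span $\mathbb{R}^2$ nor are independent.

```python
import numpy as np

# Columns are [1, 3] and [2, 6]; the second is twice the first.
M = np.column_stack([[1, 3], [2, 6]])
print(np.linalg.matrix_rank(M))   # 1, not 2: these vectors are not a basis
```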
(Note: It's sufficient to show that any condition is not met, but sometimes it's fun to show more than one just to rub it in.)
Show that $0$ can never be a basis vector.
A basis must be linearly independent, which means that each $v \in V$ can be represented by a linear combination of basis vectors with only one set of coefficients. However, $0 = c0$ for any $c \in F$, so including $0$ in a set of vectors precludes that set from being linearly independent and therefore precludes it from being a basis.
The set of vectors $e_1 = [1, 0, \ldots, 0], e_2 = [0, 1, \ldots, 0], \ldots, e_n = [0, 0, \ldots, 1]$ is the standard basis of $F^n$, where $F$ is a field such as $\mathbb{R}$ or $\mathbb{C}$. Prove that this set is indeed a basis.
In order to be a basis, the set of vectors needs to 1) span the vector space and 2) be linearly independent.
The proof that this set spans $F^n$ can be found in the Span section, and the proof that this set is linearly independent can be found in the Linear Independence section. Taken together, we conclude that the set indeed forms a basis of $F^n$.
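As a numerical sanity check (a sketch only; NumPy and the choice $n = 4$ are assumptions of mine, not from the text), the standard basis vectors are the columns of the identity matrix, which has full rank, and every vector is its own coordinate vector in this basis.

```python
import numpy as np

n = 4                  # an arbitrary choice of dimension
E = np.eye(n)          # columns are e_1, ..., e_n
print(np.linalg.matrix_rank(E))   # n: independent and spanning

# Each vector's coordinates in the standard basis are its own entries:
v = np.array([2.0, -1.0, 0.0, 7.0])
print(E @ v)           # equals v
```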
Prove that a set of vectors $v_1, \ldots, v_n$ is a basis if and only if for every $v \in V$ there is a unique set of coefficients $c_1, \ldots, c_n$ satisfying the equation
$$v = c_1v_1 + \ldots + c_nv_n$$
If $v_1, \ldots, v_n$ form a basis of $V$, then they span $V$, so each vector $v \in V$ can be written as a linear combination
$v=c_1v_1 + \ldots + c_nv_n$.
Because the vectors in a basis are linearly independent, the coefficients are unique.
Conversely, if every vector $v \in V$ can be written as a linear combination of the vectors $v_1, \ldots, v_n$, then those vectors span $V$. If the coefficients of each such linear combination are unique, then the set of vectors is linearly independent, and so they form a basis.
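This correspondence is easy to see concretely, under the assumption $V = \mathbb{R}^2$ with the basis from the first problem: when the basis vectors are the columns of an invertible matrix $B$, the coefficients of $v$ are the unique solution of $Bc = v$.

```python
import numpy as np

B = np.column_stack([[0, 1], [1, 1]])   # basis vectors as columns
v = np.array([3.0, 5.0])
c = np.linalg.solve(B, v)               # unique, since B is invertible
print(c)   # approximately [2. 3.], i.e. v = 2*[0,1] + 3*[1,1]
```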
Show that every finite set of vectors that spans a vector space contains a basis as a subset.
Hint: Construct a subset that only contains linearly independent vectors.
Let $S = \{v_1, \ldots, v_n\}$ be a set of vectors that spans $V$. If this set is linearly independent, then the proof is done. If not, construct a new set $B$ by first placing in it the first nonzero vector $v_k$. Then add $v_{k+1}$ to $B$ if $v_{k+1}$ is not in $\text{span}(B)$. Repeat this process, adding each successive $v_i$ to $B$ if it is not in $B$'s span, until all vectors in $S$ have been considered. The vectors in $B$ are linearly independent, because each vector was added only if it was not a linear combination of the vectors already in $B$. Since only vectors that are linear combinations of other elements of $B$ have been excluded, $B$ spans $V$ by the Linear Dependence Lemma. Therefore $B$ is a basis.
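The greedy construction above can be sketched in code, under the assumption that the vectors live in $\mathbb{R}^n$ and using a rank computation as the test for membership in $\text{span}(B)$; the function name `extract_basis` is hypothetical, not from the text.

```python
import numpy as np

def extract_basis(vectors):
    """Greedily keep each vector that is not in the span of those kept so far."""
    basis = []
    for v in vectors:
        candidate = basis + [np.asarray(v, dtype=float)]
        # v lies outside span(basis) exactly when appending it raises the rank
        if np.linalg.matrix_rank(np.column_stack(candidate)) == len(candidate):
            basis = candidate
    return basis

# A spanning but dependent set in R^2: the zero vector and 2*[1,3] get skipped.
S = [[0, 0], [1, 3], [2, 6], [0, 1]]
B = extract_basis(S)
print(len(B))   # 2: the kept vectors [1,3] and [0,1] form a basis
```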
Let $B$ be a basis for a vector space $V$. Let $b$ be any basis vector in $B$. Show that $B \setminus \{b\}$ is not a basis for $V$.
Because $B$ is a basis, its vectors are linearly independent. That means there is no linear combination of vectors in $B \setminus \{b\}$ that equals $b$. Therefore $b \notin \text{span}\left(B \setminus \{b\}\right)$. Thus $\text{span}\left(B \setminus \{b\}\right) \neq V$, so $B \setminus \{b\}$ is not a basis of $V$.
Let $B$ be a set of vectors in a vector space $V$. Show that the following statements are equivalent:

1. $B$ is a basis.

2. $B$ is a maximal linearly independent set: $B$ is linearly independent and no proper superset of $B$ is linearly independent.

3. $B$ is a minimal spanning set: $B$ spans $V$ and no proper subset of $B$ spans $V$.
We show that 1 implies 2, 2 implies 1, 1 implies 3, and 3 implies 1.
$1 \implies 2$: Let $B$ be a basis. Then $B$ is linearly independent and spans $V$. Let $v \in V$ with $v \notin B$. Then $v$ is equal to a linear combination of vectors in $B$, so $B \cup \{v\}$ is linearly dependent. Hence no proper superset of $B$ is linearly independent, and $B$ is maximal.
$2 \implies 1$: Let $B$ be a maximal linearly independent set. Proceed by contradiction. Assume there is a $v \in V$ such that $v$ cannot be expressed as a linear combination of vectors in $B$. Then $B \cup \{v\}$ is a linearly independent proper superset of $B$, contradicting the maximality of $B$. Therefore every $v \in V$ can be expressed as a linear combination of vectors in $B$ after all. Therefore $B$ spans $V$, and so $B$ is a basis.
$1 \implies 3$: Let $B$ be a basis. Then $B$ is linearly independent and spans $V$. Let $b \in B$. Then $b$ is not a linear combination of the other vectors in $B$. Therefore $b \notin \text{span}(B \setminus \{b\})$, so $B \setminus \{b\}$ does not span $V$.
$3 \implies 1$: Let $B$ be a minimal spanning set of $V$. Proceed by contradiction. Assume there is a $v \in B$ such that $v$ can be expressed as a linear combination of the other vectors in $B$. Then $v \in \text{span}(B \setminus \{v\})$. But then $\text{span}(B) = \text{span}(B \setminus \{v\})$, so the proper subset $B \setminus \{v\}$ spans $V$, contradicting the minimality of $B$. Therefore no $v \in B$ can be expressed as a linear combination of the other vectors in $B$ after all, so $B$ is linearly independent. Therefore $B$ is a basis.

Show that every basis for a finite-dimensional vector space has the same number of vectors in it.
Let $B_1$ and $B_2$ be bases of $V$. Then $B_1$ is linearly independent and $B_2$ spans $V$. Because a linearly independent set of vectors has at most as many vectors as a spanning set, $|B_1| \leq |B_2|$. Likewise, $B_2$ is linearly independent and $B_1$ spans $V$, so by the same logic, $|B_2| \leq |B_1|$. Therefore $|B_1| = |B_2|$.
Show that every finite spanning set of vectors has a basis as a subset.
Let $S = \{s_1, \ldots, s_n\}$ be a set of vectors that spans $V$. If $S$ is linearly independent, $S$ is a basis and the proof is done. If not, then $S$ is linearly dependent, so there is a vector $s_1 \in S$ that is a linear combination of the other vectors in $S$. By the Linear Dependence Lemma, $S_1 = S \setminus \{s_1\}$ spans $V$. In general, if $S_i$ is not linearly independent, we can remove another vector $s_{i+1}$ that is a linear combination of the other vectors in $S_i$ to form $S_{i+1}$, which still spans $V$. This process of removal repeats at most $n-1$ times, as a set containing a single nonzero vector is linearly independent. The resulting final subset $S_m$ is then a basis, as it is a linearly independent set that spans $V$.
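The removal procedure can also be sketched concretely, again assuming vectors in $\mathbb{R}^n$ and using rank comparisons to detect a redundant vector; `prune_to_basis` is a hypothetical name of mine.

```python
import numpy as np

def prune_to_basis(vectors):
    """Drop redundant vectors until the remaining ones are linearly independent."""
    S = [np.asarray(v, dtype=float) for v in vectors]
    changed = True
    while changed:
        changed = False
        for i in range(len(S)):
            rest = S[:i] + S[i + 1:]
            # S[i] is redundant when removing it leaves the span's dimension unchanged
            if rest and (np.linalg.matrix_rank(np.column_stack(rest))
                         == np.linalg.matrix_rank(np.column_stack(S))):
                S = rest
                changed = True
                break
    return S

# A dependent spanning set in R^2 prunes down to two vectors.
B = prune_to_basis([[1, 0], [2, 0], [0, 1], [1, 1]])
print(len(B))   # 2
```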
Show that every finite-dimensional vector space has a basis.
A finite-dimensional vector space is by definition spanned by a finite set of vectors. By the foregoing proof, this set contains a basis as a subset.
Show that every linearly independent set of vectors in a finite-dimensional vector space is a subset of a basis.
Let $A$ be a set of linearly independent vectors in a finite-dimensional vector space $V$. If $A$ spans $V$, then $A$ is a basis and the proof is done. Otherwise, $\text{span}(A)^c$ is nonempty, so there is a vector $v_1 \in \text{span}(A)^c$ that is not a linear combination of the vectors in $A$. Form $A_1 = A \cup \{v_1\}$, which is linearly independent. In general, if $A_i$ does not span $V$, then add another vector $v_{i+1} \in \text{span}(A_i)^c$ to form $A_{i+1}$. This process is guaranteed to terminate, since by definition $V$ is spanned by a finite set of vectors, and a linearly independent set can have at most as many vectors as a spanning set. The resulting set $A_n$ is linearly independent, and since $\text{span}(A_n)^c$ is empty, $A_n$ spans $V$. Therefore $A_n$ is a basis of $V$ that contains $A$ as a subset.
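The extension procedure admits a concrete sketch for $V = \mathbb{R}^n$, drawing candidate vectors from the standard basis (a choice of mine; any spanning set would do) and using rank to test whether a candidate lies outside the current span; `extend_to_basis` is a hypothetical name.

```python
import numpy as np

def extend_to_basis(independent, n):
    """Extend a linearly independent list of vectors in R^n to a basis of R^n."""
    basis = [np.asarray(v, dtype=float) for v in independent]
    for i in range(n):
        e = np.zeros(n)
        e[i] = 1.0                     # candidate standard basis vector e_{i+1}
        candidate = basis + [e]
        # e lies outside span(basis) exactly when appending it raises the rank
        if np.linalg.matrix_rank(np.column_stack(candidate)) == len(candidate):
            basis = candidate
    return basis

# One independent vector in R^3 extends to a full basis of R^3.
A = [[1, 1, 0]]
B = extend_to_basis(A, 3)
print(len(B))   # 3
```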
Show that a set of linearly independent vectors with the same cardinality as a basis is itself a basis.
Let $A$ be a set of $n$ linearly independent vectors, and let $B$ be a basis with $n$ vectors in it. By the foregoing proof, some superset of $A$ is a basis. Because every basis has the same cardinality $n$, and every strict superset of $A$ has more than $n$ vectors, that basis must be $A$ itself. Therefore $A$ is a basis.