Linear Algebra: Vector Spaces

Vector Spaces

Linear algebra is the study of finite-dimensional vector spaces. But before we even get to finite-dimensional vector spaces, we need to explain what a vector space is in the first place. In order to do that, we first need to define the concept of a field.

Fields

A field is a mathematical object made up of a set, often denoted $F$, and two operations, called addition $(+)$ and multiplication $(\cdot)$, that collectively fulfill the following list of requirements, known as the field axioms:

  1. Closure Under Addition: $a + b \in F$ for all $a, b \in F$.
  2. Associativity of Addition: $a + (b + c)=(a + b) + c$ for all $a, b, c \in F$.
  3. Commutativity of Addition: $a + b = b + a$ for all $a, b \in F$.
  4. Additive Identity: There exists an element $0$ such that $a+0=a$ for all $a \in F$.
  5. Additive Inverse: For each element $a \in F$ there exists another element in $F$ denoted $-a$ such that $a + (-a) = 0$.
  6. Closure Under Multiplication: $a \cdot b \in F$ for all $a, b \in F$.
  7. Associativity of Multiplication: $a \cdot (b \cdot c) = (a \cdot b) \cdot c$ for all $a, b, c \in F$.
  8. Commutativity of Multiplication: $a \cdot b = b \cdot a$ for all $a, b \in F$.
  9. Multiplicative Identity: There exists an element $1$ such that $a \cdot 1 = a$ for all $a \in F$.
  10. Multiplicative Inverse: For each element $a \in F$ with $a \neq 0$, there exists another element in $F$ denoted $a^{-1}$ such that $a \cdot a^{-1} = 1$.
  11. Distributivity: $a \cdot (b + c) = a \cdot b + a \cdot c$ for all $a, b, c \in F$.

Given this axiomatic salad, what familiar objects qualify as fields? The rational numbers, real numbers, and complex numbers are all fields. The proofs of these facts are the focus of other areas of mathematics, such as real analysis, so you can take them as given for now. If you're ever lost when reading about vector spaces defined in terms of fields rather than more specific objects (like the real numbers), just swap in "real numbers" for "field" - that should hopefully make things more concrete.

But why talk about fields, rather than things like the real numbers directly? Fields are a nice abstraction of the algebraic properties common to many number systems, such as the rational, real, and complex numbers (although not the natural numbers or integers). Using this abstraction allows us to reason about math in a more general way. Rather than prove the same theorems separately for the rational, real, and complex numbers, we prove them once for all fields, and then we can substitute in whichever particular field we care about in practice. And since there are many more fields out there than these three, that saves a lot of work. The key thing to pay attention to is the list of field axioms, since it is these properties that will be used in linear algebra proofs.
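
A quick way to build intuition for these axioms is to spot-check them on concrete values. The sketch below does this for the rational numbers using Python's exact `Fraction` type; checking finitely many samples is of course not a proof, and the helper name `check_field_axioms` is an illustrative choice, not any standard API.

```python
# Spot-check several field axioms for the rationals on a few sample values.
# Fraction arithmetic is exact, so there are no floating-point surprises.
from fractions import Fraction
from itertools import product

def check_field_axioms(samples):
    """Assert a handful of field axioms over every triple of samples."""
    for a, b, c in product(samples, repeat=3):
        assert a + b == b + a                    # commutativity of addition
        assert a + (b + c) == (a + b) + c        # associativity of addition
        assert a * b == b * a                    # commutativity of multiplication
        assert a * (b * c) == (a * b) * c        # associativity of multiplication
        assert a * (b + c) == a * b + a * c      # distributivity
        assert a + 0 == a and a * 1 == a         # additive and multiplicative identities
        assert a + (-a) == 0                     # additive inverse
        if a != 0:
            assert a * (1 / a) == 1              # multiplicative inverse
    return True

samples = [Fraction(0), Fraction(1), Fraction(-3, 4), Fraction(5, 2)]
print(check_field_axioms(samples))  # prints True
```

The integers fail only the multiplicative-inverse axiom (for instance, $2^{-1} = 1/2$ is not an integer), which is exactly why they are not a field.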

Vector Spaces

We can now define vector spaces formally in terms of fields. A vector space is a set $V$, a field $F$, and two functions, called addition and scalar multiplication, that collectively fulfill the following requirements:

  1. Closure Under Addition: $u + v \in V$ for all $u, v \in V$. Mathematically, $+ : V \times V \rightarrow V$.
  2. Associativity of Addition: $u + (v + w) = (u + v) + w$ for all $u, v, w \in V$.
  3. Commutativity of Addition: $u + v = v + u$ for all $u, v \in V$.
  4. Additive Identity: There exists an element $0 \in V$ such that $u + 0 = u$ for all $u \in V$.
  5. Additive Inverse: For every element $v \in V$ there exists another element $-v \in V$ such that $v + (-v) = 0$.
  6. Closure under Scalar Multiplication: $c \cdot v \in V$ for all $c \in F$ and $v \in V$. Mathematically, $\cdot : F \times V \rightarrow V$. We shorten $c \cdot v$ to $cv$ as a matter of concise notation, since the meaning should always be clear from the context.
  7. Multiplicative Identity: $1v = v$ for all $v \in V$, where $1$ is the multiplicative identity of the field $F$.
  8. Distributivity: $a(u+v) = au + av$ and $(a+b)u = au + bu$ for all $a, b \in F$ and all $u, v \in V$.

Elements of a vector space are called vectors. Likewise, $V$ is said to be a vector space over the field $F$, and for common vector spaces (such as $\mathbb{R}^n$ over $\mathbb{R}$), the addition and scalar multiplication operations are understood to adhere to a default definition (given in the next section) unless explicitly defined otherwise.

Notation: Unless stated otherwise, the implied name of a vector space is $V$, and its implied field is $F$. This saves us from having to constantly write out "Let $V$ be a vector space over a field $F$." Likewise, we can also say things like "Let $v_1, \ldots, v_n$ be vectors..." without having to specify that they're vectors in $V$ (over the field $F$).
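
For $\mathbb{R}^n$, the default operations are the familiar componentwise ones. As a minimal sketch, here they are with plain Python tuples standing in for vectors; the names `vadd` and `smul` are illustrative choices, not standard functions.

```python
def vadd(u, v):
    """Componentwise addition: the usual '+' on R^n."""
    return tuple(ui + vi for ui, vi in zip(u, v))

def smul(c, v):
    """Scalar multiplication: the usual scaling on R^n."""
    return tuple(c * vi for vi in v)

u, v, w = (1.0, 2.0), (3.0, -1.0), (0.5, 0.5)
zero = (0.0, 0.0)

# Spot-check a few vector space axioms on these particular vectors:
assert vadd(u, v) == vadd(v, u)                    # commutativity of addition
assert vadd(u, vadd(v, w)) == vadd(vadd(u, v), w)  # associativity of addition
assert vadd(u, zero) == u                          # additive identity
assert vadd(u, smul(-1, u)) == zero                # additive inverse
assert smul(1, u) == u                             # multiplicative identity
```

These checks confirm the axioms only for the specific vectors chosen; the actual proofs for $\mathbb{R}^n$ reduce each axiom, component by component, to the corresponding field axiom for $\mathbb{R}$.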

Exercises

  1. The definition of a vector space requires that there be an additive identity $0$ such that $u+0=u$ for all $u \in V$, but it does not say that only one element can fulfill this requirement. Show that the additive identity of a vector space is in fact unique.

    Beginning proofs about a mathematical object have only its definition and results about lower-level objects (such as fields, in this case) to rely on. Later on, nobody messes around much with things like associativity, but in the beginning it's necessary to be explicit. So these proofs will be done in two-column fashion, a la high school geometry.

    In order to prove the uniqueness of an element with a certain property, we must show that any other element with that property is in fact the same element. Here, we imagine there are two additive identities and then show that they are in fact equal.

    Let $0$ and $0'$ both be additive identities of $V$.

    $0 = 0 + 0'$ Additive Identity (applied to $0'$)
    $0 + 0' = 0' + 0$ Commutativity of Addition
    $0' + 0 = 0'$ Additive Identity (applied to $0$)
    $0 = 0'$ Chaining the previous equalities
  2. Prove that $-(-v)=v$ for all $v \in V$. In other words, prove that the additive inverse of a vector's additive inverse is in fact the original vector.

    As in the previous proof, we show that the two vectors $u$ and $-(-u)$ are equal.

    Let $u \in V$.

    $u + (-u) = 0$ Additive Inverse
    $-u + (-(-u)) = 0$ Additive Inverse
    $u + (-u) = -u + (-(-u))$ Equality of previous two steps
    $u + (u + (-u)) = u + (-u + (-(-u)))$ Add $u$ to both sides
    $u + 0 = u + (-u + (-(-u)))$ Additive Inverse
    $u + 0 = (u + (-u)) + (-(-u))$ Associativity of Addition
    $u + 0 = 0 + (-(-u))$ Additive Inverse
    $u + 0 = -(-u) + 0$ Commutativity of Addition
    $u = -(-u)$ Additive Identity
  3. Show that $0u=0$ for all $u \in V$. Note that the definition of a vector space does not explicitly give this multiplicative property of the additive identity.

    Let $u \in V$.

    $0u = (0 + 0)u$ Additive Identity
    $0u = 0u + 0u$ Distributive Property
    $0u - 0u = (0u + 0u) - 0u$ Add $-(0u)$ to both sides
    $0 = (0u + 0u) - 0u$ Additive Inverse
    $0 = 0u + (0u - 0u)$ Associativity of Addition
    $0 = 0u + 0$ Additive Inverse
    $0 = 0u$ Additive Identity
  4. Show that $a0 = 0$ for all $a \in F$.

    Let $a \in F$.

    $a0 = a(0 + 0)$ Additive Identity
    $a0 = a0 + a0$ Distributive Property
    $a0 - a0 = (a0 + a0) - a0$ Add $-(a0)$ to both sides
    $0 = (a0 + a0) - a0$ Additive Inverse
    $0 = a0 + (a0 - a0)$ Associativity of Addition
    $0 = a0 + 0$ Additive Inverse
    $0 = a0$ Additive Identity
  5. Little Timmy is faced with an earlier problem: proving that $0u=0$ for all $u \in V$. However, Little Timmy has heard of vectors before and thinks he has a clever proof:

    Let $u \in V$.

    $0u = 0(u_1, u_2, \ldots, u_n)$
    $0u = (0 \cdot u_1, 0 \cdot u_2, \ldots, 0 \cdot u_n)$
    $0u = (0, 0, \ldots, 0)$
    $0u = 0$

    Explain why the above proof is incorrect.

    The definition of a vector space gives no hint as to the structure of the elements of $V$. Little Timmy has assumed that the vectors belong to the most common kind of vector space, namely a Euclidean space such as $\mathbb{R}^n$ or $\mathbb{C}^n$. However, the elements of a vector space can be more exotic objects, such as polynomials with real coefficients, which need not be tuples of components at all.

  6. Let $\varnothing$ denote the empty set. Show that $\varnothing$ does not form a vector space over any field $F$.

    Hint: Which of the properties in the definition of a vector space does it fail to satisfy?

    The additive identity $0$ is not an element of $\varnothing$, because by definition $\varnothing$ has no elements.

  7. Show that $(-1)v=-v$ for all $v \in V$.

    $v + (-1)v = 1v + (-1)v$ Multiplicative identity
    $v + (-1)v = (1+(-1))v$ Distributive property
    $v + (-1)v = 0v$ Additive inverse
    $v + (-1)v = 0$ $0v=0$ for all $v \in V$
    $(v + (-1)v) + (-v) = 0 + (-v)$ Add $-v$ to both sides
    $(v + (-v)) + (-1)v = 0 + (-v)$ Associativity and commutativity
    $0 + (-1)v = 0 + (-v)$ Additive inverse
    $(-1)v = -v$ Additive identity
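
The identities proved in these exercises hold in every vector space, but it can still be reassuring to sanity-check them numerically. The sketch below does so on $\mathbb{R}^3$ with the usual componentwise operations; it is a spot check, not a substitute for the proofs, and the names `vadd` and `smul` are illustrative.

```python
def vadd(u, v):
    """Componentwise addition on R^n."""
    return tuple(ui + vi for ui, vi in zip(u, v))

def smul(c, v):
    """Scalar multiplication on R^n."""
    return tuple(c * vi for vi in v)

v = (2.0, -3.0, 0.5)
zero = (0.0, 0.0, 0.0)
neg_v = smul(-1, v)

assert smul(0, v) == zero        # exercise 3: 0u = 0  (note -0.0 == 0.0 in Python)
assert smul(7, zero) == zero     # exercise 4: a0 = 0
assert vadd(v, neg_v) == zero    # exercise 7: (-1)v is an additive inverse of v
assert smul(-1, neg_v) == v      # exercise 2: -(-v) = v
```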