Linear Algebra: Vector Spaces

Vector Spaces


Linear algebra is the study of finite-dimensional vector spaces. But before we even get to finite-dimensional vector spaces, we need to explain what a vector space is in the first place. The tricky thing is, vector spaces are defined in terms of more fundamental mathematical objects called fields, which can themselves be described in terms of even lower-level objects. What do we do? We define fields and see how far we can get:

A field is a mathematical object made up of a set, often denoted $F$, and two operations, called addition $(+)$ and multiplication $(\cdot)$, that collectively fulfill the following list of requirements, known as the field axioms:

  1. Associativity of Addition: $a + (b + c)=(a + b) + c$ for all $a, b, c \in F$.
  2. Associativity of Multiplication: $a \cdot (b \cdot c) = (a \cdot b) \cdot c$ for all $a, b, c \in F$.
  3. Commutativity of Addition: $a + b = b + a$ for all $a, b \in F$.
  4. Commutativity of Multiplication: $a \cdot b = b \cdot a$ for all $a, b \in F$.
  5. Additive Identity: There exists an element $0$ such that $a+0=a$ for all $a \in F$.
  6. Multiplicative Identity: There exists an element $1$ such that $a \cdot 1 = a$ for all $a \in F$.
  7. Additive Inverse: For each element $a \in F$ there exists another element in $F$ denoted $-a$ such that $a + (-a) = 0$.
  8. Multiplicative Inverse: For each element $a \neq 0 \in F$, there exists another element in $F$ denoted $a^{-1}$ such that $a \cdot a^{-1} = 1$.
  9. Distributivity: $a \cdot (b + c) = a \cdot b + a \cdot c$ for all $a, b, c \in F$.
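
Finitely many checks are not a proof, but they can make the axioms feel less abstract. Here is a minimal sketch, assuming Python and its standard-library fractions module, that spot-checks the axioms on a handful of rational numbers; the sample values are arbitrary choices made purely for illustration.

```python
from fractions import Fraction

# A few arbitrary rational numbers to test against.
samples = [Fraction(1, 2), Fraction(-3, 4), Fraction(5), Fraction(0), Fraction(7, 3)]

for a in samples:
    for b in samples:
        for c in samples:
            assert a + (b + c) == (a + b) + c          # 1. associativity of addition
            assert a * (b * c) == (a * b) * c          # 2. associativity of multiplication
            assert a + b == b + a                      # 3. commutativity of addition
            assert a * b == b * a                      # 4. commutativity of multiplication
            assert a * (b + c) == a * b + a * c        # 9. distributivity

for a in samples:
    assert a + Fraction(0) == a                        # 5. additive identity
    assert a * Fraction(1) == a                        # 6. multiplicative identity
    assert a + (-a) == Fraction(0)                     # 7. additive inverse
    if a != 0:
        assert a * (Fraction(1) / a) == Fraction(1)    # 8. multiplicative inverse

print("All sampled field axioms hold for these rational numbers.")
```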

 

So, given this axiomatic salad, what exactly qualifies as a field? The rational numbers, real numbers, and complex numbers are all fields. The proofs of these facts are left to other realms of math, such as real analysis and abstract algebra, so you can take them as given for now. If you're ever lost when reading about vector spaces defined in terms of fields rather than more concrete things (like the real numbers), just swap in "real numbers" for "field" - that should hopefully make things more concrete and easier to understand.

The good news is that you don't need to know a whole lot about fields themselves to get started learning about vector spaces. However, by defining vector spaces in terms of fields rather than the real numbers (or even the complex numbers), we can apply what we learn to much more than just plain old numbers later on. For example, rational functions - ratios of polynomials - with rational coefficients themselves form a field (but don't worry about those for now). What you do need to do is pay attention to the field axioms, since they will help you prove things about vector spaces.

We can now define vector spaces formally in terms of fields. A vector space is a set $V$, a field $F$, and two functions, called addition and scalar multiplication, that collectively fulfill the following requirements:

  1. Addition takes as input two elements of $V$, $u$ and $v$, and assigns them to an element of $V$ denoted $u+v$.
  2. Scalar multiplication takes as input an element $a \in F$ and an element $u \in V$ and assigns them to a new element in $V$ denoted $a \cdot u$, or $au$ for short.
  3. Commutativity of Addition: $u + v = v + u$ for all $u, v \in V$.
  4. Associativity of Addition: $u + (v + w) = (u + v) + w$ for all $u, v, w \in V$.
  5. Additive Identity: There exists an element $0 \in V$ such that $u + 0 = u$ for all $u \in V$.
  6. Multiplicative Identity: $1u = u$ for all $u \in V$, where $1$ is the multiplicative identity of $F$.
  7. Additive Inverse: For every element $v \in V$ there exists another element $-v \in V$ such that $v + (-v) = 0$.
  8. Distributivity: $a(u+v) = au + av$ and $(a+b)u = au + bu$ for all $a, b \in F$ and all $u, v \in V$.
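
A minimal sketch of the most familiar example may help: $\mathbb{R}^3$ over the field $\mathbb{R}$, with vectors represented as 3-tuples of Python floats (which only approximate the real numbers) and with addition and scalar multiplication defined entry by entry. The sample vectors and the spot-checks below are illustrative choices, not a proof that the requirements hold.

```python
def add(u, v):
    """Addition: takes two elements of V and returns an element of V."""
    return tuple(ui + vi for ui, vi in zip(u, v))

def scale(a, u):
    """Scalar multiplication: takes a in F and u in V, returns an element of V."""
    return tuple(a * ui for ui in u)

zero = (0.0, 0.0, 0.0)  # the additive identity in V
u, v, w = (1.0, 2.0, 3.0), (-4.0, 0.5, 2.0), (0.0, 1.0, -1.0)
a, b = 2.0, -3.0

assert add(u, v) == add(v, u)                                # commutativity of addition
assert add(u, add(v, w)) == add(add(u, v), w)                # associativity of addition
assert add(u, zero) == u                                     # additive identity
assert add(u, scale(-1.0, u)) == zero                        # additive inverse
assert scale(1.0, u) == u                                    # multiplicative identity
assert scale(a, add(u, v)) == add(scale(a, u), scale(a, v))  # distributivity
assert scale(a + b, u) == add(scale(a, u), scale(b, u))      # distributivity
```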

As a matter of notation, elements of a vector space are called vectors. What a coincidence!

As another matter of notation, a vector space is often just referred to by the set containing the vectors, rather than the set, the field, and the two functions that technically come together to form the vector space. For example, rather than writing out $(\mathbb{R}^3, \mathbb{R}, +, \cdot)$ and perhaps going further to define $+$ and $\cdot$, writing $\mathbb{R}^3$ is clear and unambiguous (as is writing "3D Euclidean space").


Problems

  1. The definition of a vector space requires that there be an additive identity $0$ such that $u+0=u$ for all $u \in V$, but it does not strictly say that there be only one element $0$ that fulfills the above requirement. Show that the additive identity of a vector space is in fact unique.

    Beginning proofs about a mathematical object have only its definition and results about lower-level objects (fields, in this case) to rely on. Nobody really messes around with things like associativity later on, but in the beginning it's necessary to be explicit. So this proof will be done in two-column fashion a la high school geometry.

    In order to prove the uniqueness of an element with a certain property, you must show that any two elements with that property are in fact the same element. Here, we imagine there are two additive identities and then show that they are in fact equal.

    Let $V$ be a vector space and let $0$ and $0'$ both be additive identities of $V$.

    $0' = 0' + 0$ Additive Identity ($0$ is an additive identity)
    $0' = 0 + 0'$ Commutativity of Addition
    $0' = 0$ Additive Identity ($0'$ is an additive identity)

    Since any two additive identities are equal, the additive identity is unique.
  2. Prove that $-(-v)=v$ for all $v \in V$. In other words, prove that the additive inverse of a vector's additive inverse is in fact the original vector.

    As in the previous proof, we show that the two vectors $u$ and $-(-u)$ are equal.

    Let $u \in V$.

    $u + (-u) = 0$ Additive Inverse
    $(-u) + (-(-u)) = 0$ Additive Inverse (applied to $-u$)
    $u + (-u) = (-u) + (-(-u))$ Both sides equal $0$
    $(u + (-u)) + u = ((-u) + (-(-u))) + u$ Add $u$ to both sides
    $(u + (-u)) + u = u + ((-u) + (-(-u)))$ Commutativity of Addition
    $(u + (-u)) + u = (u + (-u)) + (-(-u))$ Associativity of Addition
    $0 + u = 0 + (-(-u))$ Additive Inverse
    $u = -(-u)$ Additive Identity
  3. Show that $0u=0$ for all $u \in V$, where the $0$ on the left is the scalar zero in $F$ and the $0$ on the right is the zero vector in $V$. Note that the definition of a vector space does not explicitly give this multiplicative property of the additive identities.

    Let $u \in V$.

    $0u = (0 + 0)u$ Additive Identity
    $0u = 0u + 0u$ Distributive Property
    $0u -0u = (0u + 0u) -0u$ Add additive inverse to both sides
    $0 = (0u + 0u) -0u$ Additive Inverse
    $0 = 0u + (0u - 0u)$ Associativity of Addition
    $0 = 0u + 0$ Additive Inverse
    $0 = 0u$ Additive Identity
  4. Show that $a0 = 0$ for all $a \in F$, where $0$ denotes the zero vector in $V$.

    Let $a \in F$.

    $a0 = a(0 + 0)$ Additive Identity
    $a0 = a0 + a0$ Distributive Property
    $a0 - a0 = (a0 + a0) -a0$ Add additive inverse to both sides
    $0 = (a0 + a0) -a0$ Additive Inverse
    $0 = a0 + (a0 - a0)$ Associativity of Addition
    $0 = a0 + 0$ Additive Inverse
    $0 = a0$ Additive Identity
  5. Little Timmy is faced with Problem 3 above - proving that $0u=0$ for all $u \in V$. However, Little Timmy has heard of vectors before and thinks he has a clever proof:

    Let $u \in V$.

    $0u = 0(u_1,u_2,\ldots,u_n)$
    $0u = (0\cdot u_1,0\cdot u_2,\ldots,0\cdot u_n)$
    $0u = (0,0,\ldots,0)$
    $0u = 0$

    Explain why the above proof is incorrect.

    The definition of a vector space gives no hint as to the structure of the elements of $V$. Little Timmy has assumed that the vectors belong to the most common kinds of vector spaces, namely Euclidean spaces such as $\mathbb{R}^n$ and $\mathbb{C}^n$, whose vectors are lists of coordinates that can be multiplied entry by entry. However, the vectors of a vector space can be more exotic objects, such as polynomials with coefficients in $F$; a proof that works for every vector space must rely only on the axioms, as in Problems 3 and 4. (A short sketch after the problem list illustrates this with polynomials.)

  6. Let $\varnothing$ denote the empty set. Show that $\varnothing$ does not form a vector space over any field $F$.

    Hint: Which of the properties in the definition of a vector space does it fail to satisfy?

    There is no additive identity element $0 \in \varnothing$, because by definition $\varnothing$ has no elements.

  7. Show that $(-1)v=-v$ for all $v \in V$.

    $v + (-1)v = 1v + (-1)v$ Multiplicative identity
    $v + (-1)v = (1+(-1))v$ Distributive property
    $v + (-1)v = 0v$ Additive inverse
    $v + (-1)v = 0$ $0v=0$ for all $v \in V$
    $(v + (-1)v) -v = 0 -v$ Add $-v$ to both sides
    $(v + (-v)) + (-1)v = 0 -v$ Associativity and commutativity
    $0 + (-1)v = 0 -v$ Additive inverse
    $(-1)v = -v$ Additive identity
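
To make the point of Problem 5 concrete, here is a minimal sketch of a vector space whose vectors are not coordinate tuples: polynomials with rational coefficients, represented (purely as an illustrative choice) by their coefficient lists. The identities proved above, such as $0u = 0$ and $(-1)v = -v$, still hold here precisely because the proofs used only the vector space axioms and never assumed the vectors had entries.

```python
from fractions import Fraction

# Polynomials with rational coefficients, stored as coefficient lists
# [c0, c1, c2, ...] meaning c0 + c1*x + c2*x^2 + ...
# These form a vector space over the rationals, even though a "vector"
# here is not an n-tuple of Euclidean coordinates.

def p_add(p, q):
    """Addition of polynomials: pad the shorter list with zeros, then add."""
    n = max(len(p), len(q))
    p = p + [Fraction(0)] * (n - len(p))
    q = q + [Fraction(0)] * (n - len(q))
    return [pi + qi for pi, qi in zip(p, q)]

def p_scale(a, p):
    """Scalar multiplication by a rational number a."""
    return [a * pi for pi in p]

def p_trim(p):
    """Drop trailing zero coefficients so equal polynomials compare equal."""
    while p and p[-1] == 0:
        p = p[:-1]
    return p

zero_poly = []  # the zero polynomial is the additive identity

p = [Fraction(3), Fraction(-1, 2), Fraction(0), Fraction(5)]  # 3 - x/2 + 5x^3

assert p_trim(p_add(p, zero_poly)) == p_trim(p)                  # additive identity
assert p_trim(p_scale(Fraction(0), p)) == zero_poly              # 0u = 0 (Problem 3)
assert p_trim(p_add(p, p_scale(Fraction(-1), p))) == zero_poly   # p + (-1)p = 0, so (-1)p acts as -p (Problem 7)
```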