
Vector Spaces


What are vectors?
They're arrows in space, right? That's what most people are taught in high-school mathematics.

But. There are such things as...

Vector-ish Thingies
If you think about it, functions are just like vectors, no? You can add them. You can scale them. In essence, functions are just vectors with infinitely many coordinates, where the basis is $\dots, x^2, x, 1$ instead of $\begin{bmatrix}1 & 0 & \dots\end{bmatrix}^T, \begin{bmatrix}0 & 1 & \dots\end{bmatrix}^T$. With this basis (for polynomials of degree $n$ or lower, the basis is called the standard basis of $\mathbb{P}^n$), you can represent any polynomial as a linear combination of the basis vectors. For example, the polynomial $3x^2+2x+1$ can be represented as $\begin{bmatrix}3 & 2 & 1\end{bmatrix}^T$. Back to the question: what are vectors? The answer is, anything that follows the rules. A vector space is defined by 10 axioms, and all of the results that can be called "linear algebra" follow from those 10 axioms. So, it doesn't matter what a "vector space" is made of; so long as it follows the axioms, the findings hold true.
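To make the "polynomials are coordinate vectors" idea concrete, here's a minimal sketch in Python (the variable names and the descending basis order $(x^2, x, 1)$ are just my rendering of the example above):

```python
import numpy as np

# Coordinates of 3x^2 + 2x + 1 and x^2 - 4 in the basis (x^2, x, 1).
p = np.array([3, 2, 1])
q = np.array([1, 0, -4])

# Vector addition and scalar multiplication act coordinate-wise,
# exactly as they do on the polynomials themselves.
print(p + q)    # [ 4  2 -3]    ->  4x^2 + 2x - 3
print(2.5 * p)  # [7.5 5. 2.5]  ->  7.5x^2 + 5x + 2.5
```

Adding or scaling the coordinate vectors gives exactly the coordinates of the summed or scaled polynomial, which is the whole point.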

Formal Definition of a Vector Space
A vector space is a set $\mathbb{V}$ of objects (vectors) and a set of scalars (either $\mathbb{R}$ or $\mathbb{C}$), together with two operations, vector addition $\oplus$ and scalar multiplication $\odot$, that satisfy the following properties (numerically spot-checked in the sketch after the list):

  • Addition:
    • (A1) Closure: For all $u, v \in \mathbb{V}$, $u \oplus v \in \mathbb{V}$.
    • (A2) Commutativity: For all $u, v \in \mathbb{V}$, $u \oplus v = v \oplus u$.
    • (A3) Associativity: For all $u, v, w \in \mathbb{V}$, $(u \oplus v) \oplus w = u \oplus (v \oplus w)$.
    • (A4) Existence of an identity: There exists a zero vector in $\mathbb{V}$ (denoted $\textbf{0}$) such that for every $v \in \mathbb{V}$, $v \oplus \textbf{0} = v$.
    • (A5) Existence of inverses: For every $v \in \mathbb{V}$ there exists a $-v$ such that $v \oplus (-v) = \textbf{0}$.
  • Multiplication:
    • (S1) Closure: For all $u \in \mathbb{V}$ and all scalars $\alpha$, $\alpha \odot u \in \mathbb{V}$.
    • (S2) Associativity: For all $u \in \mathbb{V}$ and scalars $\alpha, \beta$, $\alpha \odot (\beta \odot u) = (\alpha \cdot \beta) \odot u$.
    • (S3) Non-scaling property: $1$ is an identity element. For all vectors $u \in \mathbb{V}$, $1 \odot u = u$.
    • (S4) Distributivity (1): For all $u \in \mathbb{V}$ and scalars $\alpha$ and $\beta$, $(\alpha + \beta) \odot u = \alpha \odot u \oplus \beta \odot u$. Note that "$+$" refers to standard addition of real and complex numbers.
    • (S5) Distributivity (2): For all $u, v \in \mathbb{V}$ and scalars $\alpha$, $\alpha \odot (u \oplus v) = \alpha \odot u \oplus \alpha \odot v$.
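Here's a quick numerical spot-check of the axioms for $\mathbb{R}^3$ with the usual $+$ and $\cdot$ (a sketch: random sampling is only evidence, not a proof, and the names are my own):

```python
import numpy as np

rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 3))  # three random vectors in R^3
a, b = rng.standard_normal(2)          # two random scalars

# (A2) commutativity and (A3) associativity of addition
assert np.allclose(u + v, v + u)
assert np.allclose((u + v) + w, u + (v + w))

# (A4) zero vector and (A5) additive inverses
assert np.allclose(v + np.zeros(3), v)
assert np.allclose(v + (-v), np.zeros(3))

# (S2) associativity and (S4)/(S5) distributivity of scalar multiplication
assert np.allclose(a * (b * u), (a * b) * u)
assert np.allclose((a + b) * u, a * u + b * u)
assert np.allclose(a * (u + v), a * u + a * v)

print("all sampled axioms hold")
```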

If $\mathbb{R}$ is the set of scalars for a vector space, that vector space is said to be a real vector space. If the set of scalars is $\mathbb{C}$, the vector space is said to be a complex vector space, or a vector space over $\mathbb{C}$. Any vector space of dimension $n$ over $\mathbb{C}$ can also be viewed as a vector space of dimension $2n$ over $\mathbb{R}$.

Subspaces
A subspace of $\mathbb{R}^n$ is a subset $\mathbb{V}$ of $\mathbb{R}^n$ that satisfies the following properties:

  1. Identity: $\vec{\textbf{0}} \in \mathbb{V}$.
  2. Closure under scalar multiplication: for all $c \in \mathbb{R}$ and $\vec{x} \in \mathbb{V}$, $c\vec{x} \in \mathbb{V}$.
  3. Closure under addition: for all $\vec{a}, \vec{b} \in \mathbb{V}$, $\vec{a} + \vec{b} \in \mathbb{V}$.
  4. Non-emptiness: $\mathbb{V} \neq \emptyset$.

These 4 properties can be simplified to one, closure under linear combinations: for all $\vec{v}, \vec{w} \in \mathbb{V}$ and scalars $c, d$, $c\vec{v} + d\vec{w} \in \mathbb{V}$. Together with non-emptiness, this single condition implies all 4 (taking $c = d = 0$ recovers the zero vector). We'll see subspaces used more when we talk about eigenvectors and eigenvalues.
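Here's a sketch of testing that one-condition version numerically for the plane $x_1 + x_2 + x_3 = 0$ in $\mathbb{R}^3$ (my own example subspace; again, sampling gives evidence of closure, not a proof):

```python
import numpy as np

rng = np.random.default_rng(1)

def on_plane(x, tol=1e-9):
    """Membership test for the subspace {x in R^3 : x1 + x2 + x3 = 0}."""
    return abs(x.sum()) < tol

def random_plane_vector():
    """Sample a vector on the plane: pick x1, x2 freely, solve for x3."""
    x1, x2 = rng.standard_normal(2)
    return np.array([x1, x2, -x1 - x2])

for _ in range(1000):
    v, w = random_plane_vector(), random_plane_vector()
    c, d = rng.standard_normal(2)
    assert on_plane(c * v + d * w)  # closure under linear combinations

print("closure held for 1000 random linear combinations")
```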

Properties of Vector Spaces

  • In any vector space, there is a unique zero vector.
  • Additive inverses are unique.
  • For any $v \in \mathbb{V}$, if $v \oplus v = v$, then $v = \textbf{0}$.
  • For any $u \in \mathbb{V}$, $0 \odot u = \textbf{0}$ (here $0$ is the scalar zero).
  • For any scalar $\alpha$, $\alpha \odot \textbf{0} = \textbf{0}$.
  • For any $v \in \mathbb{V}$, $(-1) \odot v = -v$.
  • For any $v \in \mathbb{V}$, $v = -(-v)$.
  • For any $v \in \mathbb{V}$ and scalar $\alpha$, if $\alpha \odot v = \textbf{0}$, then either $\alpha = 0$ or $v = \textbf{0}$.
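To see how such facts follow from the axioms alone, here is the standard short derivation of the fourth property, using (S4) and the third property above:

$$0 \odot u = (0 + 0) \odot u \overset{\text{(S4)}}{=} (0 \odot u) \oplus (0 \odot u),$$

so $0 \odot u$ is a vector satisfying $v \oplus v = v$, and therefore $0 \odot u = \textbf{0}$.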

Spanning Set
A set $S$ is a spanning set of $\mathbb{V}$ if $\operatorname{span}(S) = \mathbb{V}$. We then say that "$S$ spans $\mathbb{V}$." If $S \subset \mathbb{V}$, then (see the span-checking sketch after the lemmas below):

  1. If $\operatorname{span}(S) = \mathbb{V}$, then some subset of $S$ is a basis for $\mathbb{V}$.
  2. If $S$ is linearly independent, then $S$ is a subset of some basis for $\mathbb{V}$.

Lemmas:

  1. If $\mathbb{V}$ is a vector space and $S \subset T \subset \mathbb{V}$, then $\operatorname{span}(S) \subset \operatorname{span}(T)$.
  2. If $\mathbb{V}$ is a vector space and $S \subset \mathbb{V}$, then $\operatorname{span}(\operatorname{span}(S)) = \operatorname{span}(S)$.
  3. If $v_1, \dots, v_n$ are vectors in a vector space $\mathbb{V}$, then
    1. $\operatorname{span}\{v_1, \dots, v_k, \dots, v_n\} = \operatorname{span}\{v_1, \dots, v_k + \lambda v_j, \dots, v_n\}$ (for $j \neq k$), and
    2. $\operatorname{span}\{v_1, \dots, v_k, \dots, v_n\} = \operatorname{span}\{v_1, \dots, \lambda v_k, \dots, v_n\}$ (where $\lambda \in \mathbb{R}$ and $\lambda \neq 0$).
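In $\mathbb{R}^n$, "does $S$ span $\mathbb{R}^n$?" reduces to a rank computation. A sketch (the helper name and example vectors are my own):

```python
import numpy as np

def spans_Rn(vectors):
    """S spans R^n iff the matrix with the vectors as columns has rank n."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[0]

# Three vectors that do span R^3...
S = [np.array([1., 0., 0.]), np.array([1., 1., 0.]), np.array([1., 1., 1.])]
print(spans_Rn(S))  # True

# ...and a set whose third vector is the sum of the first two,
# so the span is only a plane (per lemma 3, it adds nothing new).
T = [np.array([1., 0., 0.]), np.array([1., 1., 0.]), np.array([2., 1., 0.])]
print(spans_Rn(T))  # False
```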

Basis
A basis for $\mathbb{V}$ is a linearly independent spanning set of $\mathbb{V}$: basically, a "minimal" spanning set. A vector space is finite-dimensional if it has a basis with finitely many elements.

Thm. If the set $S = \left\{ v_1, v_2, \dots, v_n \right\}$ spans $\mathbb{V}$, then any set $T$ with more than $n$ elements must be linearly dependent. From this it follows that if $B = \left\{ v_1, v_2, \dots, v_n \right\}$ is a basis for $\mathbb{V}$, then every linearly independent subset of $\mathbb{V}$ must have $n$ or fewer elements. Also, every basis for $\mathbb{V}$ has the same number of elements. The number of elements in a basis for $\mathbb{V}$ is called the dimension of $\mathbb{V}$.

Bases are used to express any vector in the vector space. For any vector $\vec{u} \in \mathbb{V}$, there is only one way to express $\vec{u}$ as a linear combination of the basis vectors. The coefficients used in this linear combination are called the coordinates, and can be written as $n$-tuples or column matrices. We'll use this a lot later when we talk about transformations and change-of-basis operations.
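In $\mathbb{R}^n$, finding the coordinates with respect to a basis amounts to solving a linear system. A sketch (the basis is my own example; the uniqueness of the solution is exactly the uniqueness of coordinates):

```python
import numpy as np

# A basis of R^2, stored as the columns of B, and a target vector u.
B = np.array([[1., 1.],
              [0., 1.]])
u = np.array([3., 5.])

# The coordinates c satisfy B @ c = u; B is invertible, so c is unique.
c = np.linalg.solve(B, u)
print(c)                      # [-2.  5.]  ->  u = -2*b1 + 5*b2
print(np.allclose(B @ c, u))  # True
```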


I wanted to talk a bit more about why we might want to use polynomials as vectors (or anything, for that matter). The truth is, linear algebra is a tool, and a very powerful one. Consider taking the derivative of a polynomial. There is no "strictly math" way to implement it in, say, Python. But if we consider the polynomial as a vector, we can take the derivative of the polynomial by simple matrix multiplication, which is easy-peasy in Python. With respect to the basis $\left\{1, x, x^2, x^3, \dots\right\}$, the derivative can be defined as $$A = \begin{bmatrix}0 & 1 & 0 & 0 & 0 & \dots\\ 0 & 0 & 2 & 0 & 0 & \dots\\ 0 & 0 & 0 & 3 & 0 & \dots\\ 0 & 0 & 0 & 0 & 4 & \dots\\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots\end{bmatrix}.$$ So, the derivative of the polynomial $p(x) = 1 + 2x + 3x^2$, whose coordinate vector is $\begin{bmatrix}1 & 2 & 3 & 0 & \dots\end{bmatrix}^T$, is just $Ap = \begin{bmatrix}2 & 6 & 0 & \dots\end{bmatrix}^T = 2 + 6x$. This is a very powerful tool, and it's why we use linear algebra. It's not just about vectors and matrices; it's about the power of abstraction and the ability to manipulate things in ways that we couldn't before.
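Here's that computation as runnable Python, truncating $A$ to work with polynomials of degree $< n$ (a sketch; the function name is my own):

```python
import numpy as np

def derivative_matrix(n):
    """The n x n matrix of d/dx on polynomials of degree < n,
    in the basis (1, x, x^2, ..., x^(n-1))."""
    A = np.zeros((n, n))
    for k in range(1, n):
        A[k - 1, k] = k  # d/dx of x^k is k * x^(k-1)
    return A

A = derivative_matrix(4)
p = np.array([1., 2., 3., 0.])  # coordinates of 1 + 2x + 3x^2
print(A @ p)                    # [2. 6. 0. 0.]  ->  2 + 6x
```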

If you're not sure how we came up with $A$, you'll see it later when we talk about transformations. For now, just know that it's a way to represent the derivative of a polynomial as a matrix.