Linear Algebra

Vectors and Coordinate Spaces


Vector
The first, most basic concept in linear algebra. A vector can be anything, as we'll soon learn. At its most basic, a vector is a list of scalars. For example, in $\mathbb{R}^2$, a vector is a list of two scalars; in $\mathbb{R}^3$, a list of three; and in $\mathbb{R}^n$, a list of $n$ scalars. A vector can be represented as a column matrix, or as a list of scalars separated by commas. For example, the vector $\vec{v}=\begin{bmatrix}1\\2\end{bmatrix}$ can also be written as $\vec{v}=\begin{bmatrix}1, 2\end{bmatrix}$. Likewise, $\vec{v}=\langle 5, 0\rangle=\begin{bmatrix}5\\0\end{bmatrix}$.
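
As a quick illustrative sketch (not part of the notes), a vector can be modeled with a NumPy array, either as a flat list of scalars or reshaped into a column matrix:

```python
import numpy as np

# A vector in R^2 is an ordered list of two scalars.
v = np.array([5, 0])

# The same vector written as a 2x1 column matrix.
v_col = v.reshape(2, 1)

print(v.shape)      # (2,)   -- a flat list of 2 scalars
print(v_col.shape)  # (2, 1) -- a 2x1 column
```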

Real Coordinate Spaces ($\mathbb{R}^n$)
If you've taken analysis, you're familiar with a Cartesian coordinate system — every point is a unique combination of $n$ real numbers. This is a real coordinate space. The set of all $n$-tuples of real numbers is denoted $\mathbb{R}^n$. For example, $\mathbb{R}^2$ is the set of all 2-tuples of real numbers, and $\mathbb{R}^3$ is the set of all 3-tuples. Here's the general form of a vector in $\mathbb{R}^n$: $$\vec{v}=\begin{bmatrix}a_1\\a_2\\\vdots\\a_n\end{bmatrix}\in\mathbb{R}^n.$$ Note that the "points" we've always worked with in high school, like $(3, 4)$, are actually vectors in $\mathbb{R}^2$. On another note, there also exist complex coordinate spaces ($\mathbb{C}^n$), where the scalars are complex numbers — but we won't focus on them.

Zero Vector
The zero vector is a vector whose components are all zero. It is denoted by $\textbf{0}$, and exists in every real coordinate space.

Adding Vectors
For vectors with identical dimensions, just add the corresponding components: $$\begin{bmatrix}6\\-2\end{bmatrix}+\begin{bmatrix}-4\\5\end{bmatrix}=\begin{bmatrix}2\\3\end{bmatrix}$$ If the dimensions are different, you can (if needed) extend the vector in the lower space by adding zeroes to its higher dimensions (like with polynomials — a quadratic has a coefficient of $0$ on the $x^5$ term).
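
A minimal sketch of both ideas in NumPy (illustrative only — the zero-padding helper here is just one way to match dimensions):

```python
import numpy as np

# Componentwise addition for vectors of the same dimension.
a = np.array([6, -2])
b = np.array([-4, 5])
print(a + b)  # [2 3]

# Padding a lower-dimensional vector with zeroes to match dimensions.
c = np.array([1, 2])         # in R^2
c_in_r3 = np.pad(c, (0, 1))  # [1 2 0], now treated as a vector in R^3
d = np.array([0, 0, 4])
print(c_in_r3 + d)           # [1 2 4]
```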

Multiplying by a Scalar
Multiplying vectors by a scalar is easy! Just multiply each component by the scalar. $$2\cdot\begin{bmatrix}1\\2\\3\end{bmatrix}=\begin{bmatrix}2\\4\\6\end{bmatrix}$$

Unit Vector
A vector with a magnitude of $1$. To normalize a vector, divide each component by the magnitude (i.e., multiply by $\frac{1}{\lVert\vec{v}\rVert}$).
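
A small sketch of normalization (not part of the notes), using the classic $3$-$4$-$5$ vector as an assumed example:

```python
import numpy as np

v = np.array([3.0, 4.0])
norm = np.linalg.norm(v)  # ||v|| = sqrt(3^2 + 4^2) = 5
v_hat = v / norm          # divide each component by the magnitude

print(norm)                   # 5.0
print(np.linalg.norm(v_hat))  # 1.0 -- v_hat is a unit vector
```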

Standard Basis
Until we formally cover bases, keep in the back of your mind that every vector in $\mathbb{R}^n$ is a combination of the standard basis vectors (which we'll call $\hat{\textbf{i}}$, $\hat{\textbf{j}}$, $\hat{\textbf{k}}$, etc.). For example, in $\mathbb{R}^2$, the standard basis vectors are, in order, $\begin{bmatrix}1\\0\end{bmatrix}$ and $\begin{bmatrix}0\\1\end{bmatrix}$.

Dot Product
$\vec{v}\cdot\vec{w}$ is the length of the projection of $\vec{w}$ onto $\vec{v}$ times the length of $\vec{v}$. The dot product can be negative (the length of the projection is counted as negative if the projection points opposite to $\vec{v}$)! It is calculated by multiplying the corresponding entries in each vector and adding them up: $$\begin{bmatrix}a\\b\\c\end{bmatrix}\cdot\begin{bmatrix}d\\e\\f\end{bmatrix}=ad+be+cf$$ Perpendicular vectors have dot product $0$ because the length of the projection is $0$. We can use dot products to determine whether vectors face in generally the same direction (dot product $>0$), are perpendicular (dot product $=0$), or face different directions (dot product $<0$). This is just something you may come across in other branches of mathematics, but we won't need dot products much in linear algebra.
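
A short sketch of the entrywise formula and the sign test (the particular vectors below are assumed examples, not from the notes):

```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([4, 5, 6])

# Sum of the products of corresponding entries.
print(np.dot(a, b))  # 1*4 + 2*5 + 3*6 = 32

# The sign of the dot product reveals relative direction.
assert np.dot([1, 0], [0, 1]) == 0  # perpendicular
assert np.dot([1, 1], [2, 1]) > 0   # generally the same direction
assert np.dot([1, 0], [-1, 1]) < 0  # generally opposite directions
```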


Now we need to define some terms that will be useful in the future.

Linear Combination
Suppose you have some vectors $\vec{v_1}, \vec{v_2},\ldots,\vec{v_n}$ in $\mathbb{R}^m$ (a real coordinate space). A linear combination is any sum of these vectors in which each vector is scaled by a real constant: $$a_1\vec{v_1}+a_2\vec{v_2}+\ldots+a_n\vec{v_n}.$$ As mentioned earlier, every vector in $\mathbb{R}^n$ can be represented as a linear combination of the standard basis vectors — but we'll see this term pop up a lot more!
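
A minimal sketch of one linear combination (the vectors and coefficients are assumed examples):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])

# The linear combination 3*v1 + (-2)*v2.
combo = 3.0 * v1 + (-2.0) * v2
print(combo)  # [ 3. -2.  4.]
```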

Linear Dependence
A linearly dependent set is one in which some member vector can be represented as a linear combination of the other vectors in the set (i.e., that vector doesn't add any new "dimensionality" to the set). A more formal definition: a set $S=\{\vec{v_1},\ldots,\vec{v_n}\}$ is linearly dependent if and only if $$c_1\vec{v_1}+c_2\vec{v_2}+\ldots+c_n\vec{v_n}=\textbf{0}$$ has a solution where not all of $c_1,\ldots,c_n$ are zero. Equivalently, a set is linearly independent if the only solution to the above equation is $c_1=c_2=\ldots=c_n=0$.
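
One practical way to test this (a sketch, not part of the notes — the helper name is made up): a set of $n$ vectors is dependent exactly when the matrix built from them has rank less than $n$.

```python
import numpy as np

def is_linearly_dependent(vectors):
    """True if some vector in the set is a linear combination of
    the others, i.e. rank < number of vectors."""
    m = np.array(vectors, dtype=float)
    return bool(np.linalg.matrix_rank(m) < len(vectors))

# [2, 4] = 2 * [1, 2], so this set is dependent.
print(is_linearly_dependent([[1, 2], [2, 4]]))  # True

# The standard basis of R^2 is independent.
print(is_linearly_dependent([[1, 0], [0, 1]]))  # False
```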

Span
The span of a set $S$ of vectors is defined as the set of all linear combinations of those vectors. For example, two linearly independent vectors in $\mathbb{R}^2$ span a plane (all of $\mathbb{R}^2$). The span of a set of collinear $2$-tuples is the line (through the origin) on which they lie. Notation: $$\text{span}(\{\vec{\textbf{0}}\})=\{(0,0)\}$$
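
As a closing sketch (illustrative only, with assumed example vectors): asking whether $\vec{w}$ lies in $\text{span}(\{\vec{v_1}, \vec{v_2}\})$ is asking whether some linear combination of $\vec{v_1}$ and $\vec{v_2}$ equals $\vec{w}$, which is a linear system we can solve numerically.

```python
import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
w = np.array([2.0, 3.0, 5.0])  # happens to equal 2*v1 + 3*v2

# Solve [v1 v2] c = w in the least-squares sense; if the
# residual is zero, w lies in the span of {v1, v2}.
A = np.column_stack([v1, v2])
c, *_ = np.linalg.lstsq(A, w, rcond=None)
print(np.allclose(A @ c, w))  # True: w is in the span
print(c)                      # the coefficients [2. 3.]
```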