What are vectors?
They're arrows in space, right? This is what most people are taught in high school-level mathematics.
But. There are such things as...
Vector-ish Thingies
If you think about it, functions are just like vectors, no? You can add them. You can scale them. In essence, functions are just vectors with infinitely many coordinates, where the basis is $\{1, x, x^2, x^3, \dots\}$ instead of $\{\mathbf{e}_1, \mathbf{e}_2, \dots, \mathbf{e}_n\}$.
With this basis (for polynomials of degree $n$ or lower, the basis $\{1, x, \dots, x^n\}$ is called the standard basis of $P_n$), you can represent any polynomial as a linear combination of the basis vectors. For example, the polynomial $2 + 3x + x^2$ can be represented as $(2, 3, 1)$.
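To make that concrete, here's a minimal Python sketch. The coefficient-array convention (index $i$ holds the coefficient of $x^i$) is just one possible choice:

```python
import numpy as np

# A polynomial in P_2, stored as its coordinates in the basis {1, x, x^2}:
# index i holds the coefficient of x^i.
p = np.array([2.0, 3.0, 1.0])   # 2 + 3x + x^2
q = np.array([1.0, 0.0, 4.0])   # 1 + 4x^2

# Polynomial addition and scaling are exactly vector addition and scaling.
print(p + q)     # [3. 3. 5.]    ->  3 + 3x + 5x^2
print(2.5 * p)   # [5.  7.5 2.5] ->  5 + 7.5x + 2.5x^2
```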
Back to the question: what are vectors? The answer is, anything that follows the rules. A vector space is defined by 10 axioms, and all of the results that can be called "linear algebra" follow from these 10 axioms. So it doesn't matter what a "vector space" actually contains; so long as it follows the axioms, the findings hold true.
Formal Definition of a Vector Space
A vector space is a set $V$ of objects (vectors) and a set of scalars (either $\mathbb{R}$ or $\mathbb{C}$), together with two binary operations, vector addition $\mathbf{u} + \mathbf{v}$ and scalar multiplication $c\mathbf{v}$, that satisfy the following properties:
- Addition:
  - (A1) Closure: For all $\mathbf{u}, \mathbf{v} \in V$, $\mathbf{u} + \mathbf{v} \in V$.
  - (A2) Commutativity: For all $\mathbf{u}, \mathbf{v} \in V$, $\mathbf{u} + \mathbf{v} = \mathbf{v} + \mathbf{u}$.
  - (A3) Associativity: For all $\mathbf{u}, \mathbf{v}, \mathbf{w} \in V$, $(\mathbf{u} + \mathbf{v}) + \mathbf{w} = \mathbf{u} + (\mathbf{v} + \mathbf{w})$.
  - (A4) Existence of an identity: There exists a zero vector in $V$ (denoted $\mathbf{0}$) such that for every $\mathbf{v} \in V$, $\mathbf{v} + \mathbf{0} = \mathbf{v}$.
  - (A5) Existence of inverse: There exists for every $\mathbf{v} \in V$ a $-\mathbf{v} \in V$ such that $\mathbf{v} + (-\mathbf{v}) = \mathbf{0}$.
- Scalar multiplication:
  - (S1) Closure: For all $\mathbf{v} \in V$ and all scalars $c$, $c\mathbf{v} \in V$.
  - (S2) Associativity: For all $\mathbf{v} \in V$ and scalars $c, d$, $c(d\mathbf{v}) = (cd)\mathbf{v}$.
  - (S3) Non-scaling property: $1$ is an identity element. For all vectors $\mathbf{v} \in V$, $1\mathbf{v} = \mathbf{v}$.
  - (S4) Distributivity (1): For all $\mathbf{v} \in V$ and scalars $c$ and $d$, $(c + d)\mathbf{v} = c\mathbf{v} + d\mathbf{v}$. Note that the "$+$" in $c + d$ refers to standard addition of real and complex numbers.
  - (S5) Distributivity (2): For all $\mathbf{u}, \mathbf{v} \in V$ and scalars $c$, $c(\mathbf{u} + \mathbf{v}) = c\mathbf{u} + c\mathbf{v}$.
If $\mathbb{R}$ is the set of scalars for a vector space, that vector space is said to be a real vector space. If the set of scalars is $\mathbb{C}$, the vector space is said to be a complex vector space, or a vector space over $\mathbb{C}$. Any vector space defined over $\mathbb{C}$ with dimension $n$ can be viewed as a vector space over $\mathbb{R}$ with dimension $2n$.
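The most familiar example is $\mathbb{R}^n$ itself. Here's a minimal numerical spot-check of a few axioms in $\mathbb{R}^3$ with NumPy; a sanity check on random vectors, not a proof:

```python
import numpy as np

rng = np.random.default_rng(0)
u, v, w = rng.random(3), rng.random(3), rng.random(3)  # vectors in R^3
c, d = rng.random(), rng.random()                      # scalars in R

assert np.allclose(u + v, v + u)                # (A2) commutativity
assert np.allclose((u + v) + w, u + (v + w))    # (A3) associativity
assert np.allclose(v + np.zeros(3), v)          # (A4) zero vector
assert np.allclose(v + (-v), np.zeros(3))       # (A5) additive inverse
assert np.allclose(c * (d * v), (c * d) * v)    # (S2) associativity
assert np.allclose(1 * v, v)                    # (S3) identity scalar
assert np.allclose((c + d) * v, c * v + d * v)  # (S4) distributivity over scalars
assert np.allclose(c * (u + v), c * u + c * v)  # (S5) distributivity over vectors
```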
Subspaces
A subspace of a vector space $V$ is some subset $H$ of $V$ that satisfies the following properties:
- Identity: $\mathbf{0} \in H$.
- Closure under multiplication: for all $\mathbf{v} \in H$ and scalars $c$, $c\mathbf{v} \in H$.
- Closure under addition: for all $\mathbf{u}, \mathbf{v} \in H$, $\mathbf{u} + \mathbf{v} \in H$.
- Non-emptiness: $H \neq \emptyset$.
These 4 properties can be simplified to one, closure under linear combinations: a nonempty $H$ is a subspace precisely when, for all $\mathbf{u}, \mathbf{v} \in H$ and scalars $c, d$, we have $c\mathbf{u} + d\mathbf{v} \in H$, as this implies the 4 above hold true. We'll see subspaces used more when we talk about eigenvectors and eigenvalues.
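For example, the plane $H = \{(x, y, z) \in \mathbb{R}^3 : x + y + z = 0\}$ is a subspace of $\mathbb{R}^3$. Here's a minimal numerical illustration of the closure test in Python; the `in_H` helper and the particular vectors are made up for the demo:

```python
import numpy as np

def in_H(v, tol=1e-9):
    """Membership test for H = {(x, y, z) : x + y + z = 0}."""
    return abs(v.sum()) < tol

u = np.array([1.0, -1.0, 0.0])   # in H
v = np.array([2.0, 3.0, -5.0])   # in H
c, d = -4.0, 7.0

assert in_H(np.zeros(3))      # contains the zero vector
assert in_H(c * u + d * v)    # closed under linear combinations
```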
Properties of Vector Spaces
- In any vector space, there is a unique zero vector.
- Additive inverses are unique.
- For any $\mathbf{u}, \mathbf{v}, \mathbf{w} \in V$, if $\mathbf{u} + \mathbf{w} = \mathbf{v} + \mathbf{w}$, then $\mathbf{u} = \mathbf{v}$.
- For any $\mathbf{v} \in V$, $0\mathbf{v} = \mathbf{0}$.
- For any scalar $c$, $c\mathbf{0} = \mathbf{0}$.
- For any $\mathbf{v} \in V$, $(-1)\mathbf{v} = -\mathbf{v}$.
- For any $\mathbf{v} \in V$, $-(-\mathbf{v}) = \mathbf{v}$.
- For any $\mathbf{v} \in V$ and scalar $c$, if $c\mathbf{v} = \mathbf{0}$, either $c = 0$ or $\mathbf{v} = \mathbf{0}$.
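These all follow from the 10 axioms alone; no coordinates needed. As a taste of how such proofs go, here's the standard derivation of $0\mathbf{v} = \mathbf{0}$:

$$0\mathbf{v} = (0 + 0)\mathbf{v} \overset{\text{(S4)}}{=} 0\mathbf{v} + 0\mathbf{v},$$

and adding $-(0\mathbf{v})$ to both sides (using (A5), (A3), and (A4)) leaves $\mathbf{0} = 0\mathbf{v}$.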
Spanning Set
A set $S$ is a spanning set of $V$ if $\operatorname{span}(S) = V$. This means that "$S$ spans $V$." If $S \subseteq V$, then:
- If $S$ spans $V$, then some subset of $S$ is a basis for $V$.
- If $S$ is linearly independent, then $S$ is a subset of some basis for $V$.
Lemmas:
- If $V$ is a vector space and $S \subseteq V$, then $\operatorname{span}(S)$ is a subspace of $V$.
- If $V$ is a vector space and $S \subseteq T \subseteq V$, then $\operatorname{span}(S) \subseteq \operatorname{span}(T)$.
- If $V$ is a vector space and $H \subseteq V$ is a subspace such that $S \subseteq H$, then $\operatorname{span}(S) \subseteq H$.
- If $\mathbf{v}_1, \dots, \mathbf{v}_n$ are vectors in vector space $V$, then
  - $\operatorname{span}(\mathbf{v}_1, \dots, \mathbf{v}_n) = \operatorname{span}(\mathbf{v}_1, \dots, \mathbf{v}_n, \mathbf{w})$ (where $\mathbf{w} \in V$ and $\mathbf{w} \in \operatorname{span}(\mathbf{v}_1, \dots, \mathbf{v}_n)$).
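Once you have coordinates, "does this set span?" and "is this set linearly independent?" both reduce to a rank computation. A sketch for subsets of $\mathbb{R}^n$ (the helper names are mine, not standard):

```python
import numpy as np

def spans_Rn(vectors):
    """The vectors span R^n iff the matrix with them as columns has rank n."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[0]

def linearly_independent(vectors):
    """The vectors are independent iff the rank equals the number of vectors."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

vs = [np.array([1.0, 0, 0]), np.array([0, 1.0, 0]), np.array([1.0, 1.0, 0])]
print(spans_Rn(vs))              # False: everything lies in the z = 0 plane
print(linearly_independent(vs))  # False: the third is the sum of the first two
```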
Basis
A basis for $V$ is a linearly independent spanning set of $V$. Basically, a "minimal" spanning set. A vector space is finite-dimensional if it has a basis with finitely many elements.
Thm. If the set $\{\mathbf{v}_1, \dots, \mathbf{v}_n\}$ spans $V$, then any set with more than $n$ elements must be linearly dependent. From this it follows that if $B$ is a basis for $V$ with $n$ elements, then every linearly independent subset of $V$ must have $n$ or fewer elements. Also, every basis for $V$ has the same number of elements. The number of elements in a basis for $V$ is called the dimension of $V$, written $\dim V$.
Bases are used to express any vector in the vector space. For any vector $\mathbf{v} \in V$, there is only one way to express $\mathbf{v}$ as a linear combination of the basis vectors. The coefficients used in this linear combination are called the coordinates of $\mathbf{v}$, and can be written as $n$-tuples or column matrices. We'll use this a lot later when we talk about transformations and change of basis operations.
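Finding those coordinates is just solving a linear system: stack the basis vectors as the columns of a matrix $B$ and solve $B\mathbf{c} = \mathbf{v}$. A sketch in Python, with a basis of $\mathbb{R}^3$ chosen purely for illustration:

```python
import numpy as np

# A basis for R^3, stored as the columns of B (chosen just for this demo).
B = np.column_stack([
    np.array([1.0, 0.0, 0.0]),
    np.array([1.0, 1.0, 0.0]),
    np.array([1.0, 1.0, 1.0]),
])
v = np.array([6.0, 4.0, 1.0])

c = np.linalg.solve(B, v)  # coordinates of v in this basis (unique!)
print(c)                   # [2. 3. 1.]
assert np.allclose(B @ c, v)
```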
I wanted to talk a bit more about why we might want to use polynomials as vectors (or anything else, for that matter). The truth is, linear algebra is a tool, and a very powerful one. Consider taking the derivative of a polynomial. There is no "strictly math" way to implement it in, say, Python. But if we represent the polynomial as a vector of coefficients, we can take its derivative by simple matrix multiplication, which is easy-peasy in Python. With respect to the basis $\{1, x, x^2, x^3\}$, the derivative on $P_3$ is the matrix

$$D = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 3 \\ 0 & 0 & 0 & 0 \end{pmatrix},$$

so the derivative of the polynomial $a + bx + cx^2 + dx^3$ is just $D(a, b, c, d)^T = (b, 2c, 3d, 0)^T$, i.e. $b + 2cx + 3dx^2$. This is a very powerful tool, and it's why we use linear algebra. It's not just about vectors and matrices; it's about the power of abstraction and the ability to manipulate things in ways that we couldn't before.
If you're not sure how we came up with $D$, you'll see it later when we talk about transformations. For now, just know that it's a way to represent the derivative of a polynomial as a matrix.
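And here's the derivative-by-matrix-multiplication idea as a runnable sketch; the $D$ below is the same one as above, and the sample polynomial is arbitrary:

```python
import numpy as np

# Derivative on P_3 with respect to the basis {1, x, x^2, x^3}:
# column j is the coordinate vector of d/dx (x^j).
D = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 2.0, 0.0],
    [0.0, 0.0, 0.0, 3.0],
    [0.0, 0.0, 0.0, 0.0],
])

p = np.array([5.0, 2.0, -4.0, 1.0])  # 5 + 2x - 4x^2 + x^3
dp = D @ p                           # derivative via matrix multiplication
print(dp)                            # [ 2. -8.  3.  0.]  ->  2 - 8x + 3x^2
```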