Eigenvectors
Eigenvectors of a transformation are vectors that stay on their
span during the transformation. There are two similar ways of
defining them: through transformations and through matrices. The gist is
the same: a nonzero (important!) vector v is an eigenvector of a
transformation T (or of its matrix A) if the transformation only
scales it, i.e. T(v) = λv, or equivalently Av = λv, for some scalar λ.
Eigenvalue
An eigenvalue is the scalar λ that the eigenvector gets
multiplied by as a result of the transformation. Why are
eigenvectors important? Consider a 3D rotation: if you can find
an eigenvector with eigenvalue 1, you have found the axis of
rotation for that transformation.
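Here is a small numpy sketch of that rotation example (the angle and axis are arbitrary choices): build a rotation about the z-axis, then recover the axis as the eigenvector whose eigenvalue is 1.

```python
import numpy as np

# Rotation by 30 degrees about the z-axis.
theta = np.radians(30)
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s, 0.0],
              [s,  c, 0.0],
              [0.0, 0.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(R)

# The eigenvalues are exp(i*theta), exp(-i*theta), and 1; the eigenvector
# belonging to the eigenvalue 1 is the rotation axis.
axis_index = np.argmin(np.abs(eigvals - 1.0))
axis = np.real(eigvecs[:, axis_index])
print(axis)   # [0. 0. 1.] (up to sign): the z-axis, as expected
```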
Eigenbasis
Whenever a matrix has zeroes everywhere except on the diagonal,
it's called a diagonal matrix. A diagonal transformation matrix means that
every single basis vector is an eigenvector, with its eigenvalue
sitting in the corresponding diagonal entry! Diagonal matrices
allow you to do a lot: computations with them are very easy.
But... isn't it unlikely that you'll get a diagonal matrix as
your transformation? Well, funny thing: if you can find a set
of eigenvectors that spans the space (an eigenbasis), you can change basis
to those eigenvectors to get a diagonal transformation
matrix!
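A minimal sketch of that change of basis, assuming numpy (the matrix A is an arbitrary example with two distinct eigenvalues): put the eigenvectors into the columns of P; then inv(P) @ A @ P is diagonal, and computations like matrix powers become easy.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of P are eigenvectors of A; eigvals holds the matching eigenvalues.
eigvals, P = np.linalg.eig(A)

# Change basis to the eigenbasis: the transformation matrix becomes diagonal.
D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))   # diagonal, with the eigenvalues 5 and 2 on the diagonal

# Easy computation example: A^3 = P D^3 inv(P), and D^3 just cubes the diagonal.
A_cubed = P @ np.diag(eigvals**3) @ np.linalg.inv(P)
print(np.allclose(A_cubed, np.linalg.matrix_power(A, 3)))   # True
```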
Finding eigenthings
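The usual recipe: the eigenvalues of A are the roots of the characteristic polynomial det(A − λI) = 0, and for each eigenvalue λ the eigenvectors are the nonzero solutions of (A − λI)v = 0. Here is a minimal sketch of that recipe, assuming sympy is available (the example matrix is an arbitrary choice).

```python
from sympy import Matrix, symbols, solve, eye, expand

lam = symbols('lambda')
A = Matrix([[4, 1],
            [2, 3]])

# Step 1: eigenvalues are the roots of det(A - lambda*I) = 0.
char_poly = expand((A - lam * eye(2)).det())
print(char_poly)              # lambda**2 - 7*lambda + 10
print(solve(char_poly, lam))  # [2, 5]

# Step 2: for each eigenvalue, the eigenvectors are the nonzero vectors
# in the null space of (A - lambda*I).
for eigenvalue in solve(char_poly, lam):
    print(eigenvalue, (A - eigenvalue * eye(2)).nullspace())
```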
Theorems
There are a ton of results related to eigenvalues and
eigenvectors. Here are a few. Try to prove each one, or at least
understand why they're true.
Thm. If
Thm.
Thm. For a linear transformation T : V → V, the eigenvectors corresponding to a given eigenvalue λ, together with the zero vector, form a subspace of V (the eigenspace of λ).
Thm. If
Thm. For an upper or lower triangular matrix, the eigenvalues are the diagonal elements. (See the quick numerical check after this list.)
Thm.
Thm. If
Thm. If
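As a quick numerical check of the triangular-matrix theorem, here is a numpy sketch (the specific matrix is arbitrary):

```python
import numpy as np

# An upper triangular matrix: everything below the diagonal is zero.
T = np.array([[2.0, 7.0, 1.0],
              [0.0, 5.0, 3.0],
              [0.0, 0.0, 9.0]])

# Its eigenvalues are exactly the diagonal entries 2, 5, 9.
eigvals = np.linalg.eigvals(T)
print(np.sort(eigvals))                                    # [2. 5. 9.]
print(np.allclose(np.sort(eigvals), np.sort(np.diag(T))))  # True
```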
Trace
The trace of an n × n matrix is the sum of its diagonal entries.
Thm. Suppose an n × n matrix has eigenvalues λ₁, ..., λₙ, counted with multiplicity. Then its trace equals λ₁ + ... + λₙ.
There are a few things to know: the trace is linear, tr(AB) = tr(BA) even when AB ≠ BA, and the trace is unchanged by a change of basis, so it's a property of the transformation itself, not just of one matrix for it.
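A quick numpy sanity check of those trace facts (the matrices are arbitrary examples):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
B = np.array([[0.0, 1.0],
              [1.0, 1.0]])

# Trace = sum of the diagonal entries.
print(np.trace(A))                                            # 7.0
# It also equals the sum of the eigenvalues (5 and 2 here).
print(np.isclose(np.trace(A), np.sum(np.linalg.eigvals(A))))  # True
# tr(AB) = tr(BA), even though AB and BA are different matrices.
print(np.trace(A @ B), np.trace(B @ A))                       # 6.0 6.0
```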