Transformations tend to stretch or shrink (squish) space. So how do we measure by how much? We can measure the factor by which a given area increases or decreases.
Notation
Let $\tilde{A}_{ij}$ be the $(n-1) \times (n-1)$ submatrix of $A$ obtained by removing the $i$th row and $j$th column of $A$.
Determinant
The determinant is the factor by which a given transformation changes any area. The determinant is $0$ if the transformation squashes space into a lower dimension (for example, $\mathbb{R}^3$ is flattened into a plane, or even a line). Well... ish. Sorry. The determinant can be negative! This is when space is "inverted" (its orientation is flipped).
Consider the matrix $\begin{pmatrix} a & 0 \\ 0 & d \end{pmatrix}$, whose determinant is $ad$. Do you see why? We stretch the $x$-axis by $a$, and the $y$-axis by $d$, so the area of any shape is stretched by $ad$.
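As a quick numerical sanity check, here is a minimal sketch assuming numpy (the values $a = 3$ and $d = 2$ are arbitrary illustrative choices):

```python
import numpy as np

# Diagonal map: stretch the x-axis by a = 3 and the y-axis by d = 2.
A = np.array([[3.0, 0.0],
              [0.0, 2.0]])

# The unit square spanned by e1 and e2 maps to the rectangle spanned by
# the columns of A, so its area goes from 1 to a * d.
print(np.linalg.det(A))  # 6.0 == 3 * 2
```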
The determinant ($\det$) is more formally defined recursively:
- If $A$ is the $1 \times 1$ matrix $(a)$, then $\det(A) = a$.
- If $A$ is an $n \times n$ matrix (where $n \geq 2$), then
  $$\det(A) = \sum_{j=1}^{n} (-1)^{1+j} A_{1j} \det(\tilde{A}_{1j}).$$
For a matrix $A$, the scalar quantity $\det(\tilde{A}_{ij})$ is called a minor of $A$, and $(-1)^{i+j} \det(\tilde{A}_{ij})$ is called a cofactor. To compute the determinant of $A$, we can
- Compute the cofactor of each element in the first row.
- Multiply each element in the first row by its cofactor and sum the results.
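This procedure translates directly into code. A from-scratch sketch in pure Python (the names `det` and `submatrix` are my own; the sign $(-1)^j$ matches $(-1)^{1+j}$ once rows and columns are 0-indexed):

```python
def submatrix(A, i, j):
    """Return A with row i and column j removed (the submatrix in the minor)."""
    return [row[:j] + row[j + 1:] for k, row in enumerate(A) if k != i]

def det(A):
    """Determinant by recursive cofactor expansion along the first row."""
    n = len(A)
    if n == 1:                                   # base case: det((a)) = a
        return A[0][0]
    total = 0
    for j in range(n):
        cofactor = (-1) ** j * det(submatrix(A, 0, j))
        total += A[0][j] * cofactor
    return total

print(det([[3, 0], [0, 2]]))                     # 6
print(det([[1, 2, 3], [4, 5, 6], [7, 8, 9]]))    # 0 (rows are dependent)
```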
We can do this for any row or column, not just the first one. This obviously means that if a matrix has a zero row, the determinant is zero (expand along the zero row: every term vanishes). Similarly, for an upper triangular matrix, the determinant is the product of the diagonal elements (expand repeatedly along the first column). The same is true for diagonal and lower triangular matrices.
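A quick check of the triangular rule, again assuming numpy (the matrix is an arbitrary upper triangular example):

```python
import numpy as np

U = np.array([[2.0, 7.0, 1.0],
              [0.0, 3.0, 5.0],
              [0.0, 0.0, 4.0]])

# Product of the diagonal: 2 * 3 * 4 = 24.
print(np.linalg.det(U))  # 24.0 (up to float round-off)
```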
- $\det(A^t) = \det(A)$. Therefore, for any square matrix $A$, anything the statements below say about rows holds for columns too.
- If $A$ has two identical rows or columns, $\det(A) = 0$. That is, if $A_i = A_j$ for some $i \neq j$, $\det(A) = 0$.
- If $E$ is the elementary matrix corresponding to interchanging two rows of $A$, then $\det(E) = -1$ and $\det(EA) = -\det(A)$.
- If $E$ is the elementary row matrix that corresponds to multiplying a row of $A$ by a scalar $k$, then $\det(E) = k$ and $\det(EA) = k \det(A)$.
- If $E$ is the elementary row matrix corresponding to adding $k$ times row $i$ of $A$ to row $j$, then $\det(E) = 1$ and $\det(EA) = \det(A)$.
- For any two $n \times n$ matrices $A$ and $B$, $\det(AB) = \det(A) \det(B)$. (These elementary-matrix and product rules are spot-checked in the sketch after this list.)
- From this theorem, it follows that if $A$ is an invertible matrix, $\det(A^{-1}) = \frac{1}{\det(A)}$ (take determinants of $AA^{-1} = I$).
- Also, similar matrices have the same determinant (proven by taking determinants of $B = Q^{-1}AQ$).
- Note that this means if $V$ is a finite-dimensional vector space and $T: V \to V$ is linear, then $\det(T)$, defined as the determinant of any matrix representation of $T$, is a well-defined scalar, independent of the choice of basis.
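These properties are easy to spot-check numerically. A minimal sketch assuming numpy (the random matrices and the scalar $k = 5$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
I = np.eye(3)
k = 5.0

# Interchanging two rows: det(E) = -1 and det(EA) = -det(A).
E_swap = I[[1, 0, 2]]
print(np.isclose(np.linalg.det(E_swap), -1.0))                       # True
print(np.isclose(np.linalg.det(E_swap @ A), -np.linalg.det(A)))      # True

# Scaling a row by k: det(E) = k and det(EA) = k * det(A).
E_scale = I.copy(); E_scale[1, 1] = k
print(np.isclose(np.linalg.det(E_scale @ A), k * np.linalg.det(A)))  # True

# Adding k times row 0 to row 2: det(E) = 1 and det(EA) = det(A).
E_add = I.copy(); E_add[2, 0] = k
print(np.isclose(np.linalg.det(E_add @ A), np.linalg.det(A)))        # True

# Product rule, inverses, and similarity invariance.
print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))  # True
print(np.isclose(np.linalg.det(np.linalg.inv(A)), 1 / np.linalg.det(A)))      # True
Q = rng.standard_normal((3, 3))   # invertible with probability 1
print(np.isclose(np.linalg.det(np.linalg.inv(Q) @ A @ Q), np.linalg.det(A)))  # True
```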
A square matrix $A$ is nonsingular if and only if its determinant is nonzero. There's a proof, but think of it this way: a zero determinant decreases the dimension of space (and therefore the kernel is nontrivial). That means that the transformation isn't invertible. The proof (vaguely) relies on the facts that no elementary row operation has a zero determinant, and that if $A$ is singular, it can be transformed to an upper triangular matrix with at least one zero on its diagonal (meaning it has a zero determinant), so the determinant of $A$ is $0$.
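A minimal illustration, assuming numpy (the matrix is an arbitrary rank-deficient example):

```python
import numpy as np

# The second row is 2 times the first: this map squashes the plane onto a
# line, so areas scale by 0 and the matrix is singular.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.det(S))  # 0.0

try:
    np.linalg.inv(S)
except np.linalg.LinAlgError as e:
    print("not invertible:", e)  # Singular matrix
```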