Linear Transformation
A transformation is a function that takes in a vector and outputs another vector — not at all different from our typical AP Calculus BC-style vector-valued functions. Now, however, we're working with vectors that could be anything, not just Cartesian coordinates.
To denote that a transformation $T$ has domain $V$ and range $W$, we write $T: V \to W$. A transformation $T$ (mathematically) is linear if for all vectors $\vec{u}, \vec{v} \in V$ and scalars $c$ and $d$:

$$T(c\vec{u} + d\vec{v}) = cT(\vec{u}) + dT(\vec{v})$$
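As a quick sanity check, here's a minimal numpy sketch (the matrix $A$ and the vectors are arbitrary stand-ins) verifying the linearity condition numerically for one particular map:

```python
import numpy as np

# An arbitrary matrix standing in for a linear map T: R^2 -> R^2.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
T = lambda v: A @ v

u = np.array([1.0, -2.0])
v = np.array([4.0, 0.5])
c, d = 3.0, -1.5

# T(c*u + d*v) should equal c*T(u) + d*T(v) for a linear map.
print(np.allclose(T(c * u + d * v), c * T(u) + d * T(v)))  # True
```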
For any vector space $V$ with basis $\mathcal{B} = \{\vec{b}_1, \dots, \vec{b}_n\}$, the transformation $T: V \to \mathbb{R}^n$ defined by $T(\vec{x}) = [\vec{x}]_{\mathcal{B}}$ is linear, one-to-one, and onto, where $[\vec{x}]_{\mathcal{B}}$ is the coordinate vector of $\vec{x}$ with respect to $\mathcal{B}$.
How do we describe these numerically? An interesting observation is that we only need to follow where the basis vectors end up: since the linear transformation preserves addition and scalar multiplication (linear combination), a vector like $\vec{v} = a\hat{\imath} + b\hat{\jmath}$ transformed is the same linear combination of the transformed $\hat{\imath}$ and $\hat{\jmath}$ as the original vector was of $\hat{\imath}$ and $\hat{\jmath}$! So, a linear transformation is completely described by the coordinates where $\hat{\imath}$ and $\hat{\jmath}$ land. This can be packaged into a matrix where each column describes the transformed locations of $\hat{\imath}$ and $\hat{\jmath}$. In the most general ($m \times n$) case:

$$A = \begin{bmatrix} | & & | \\ T(\vec{e}_1) & \cdots & T(\vec{e}_n) \\ | & & | \end{bmatrix}$$
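For instance, here's a small numpy sketch (the landing spots for $\hat{\imath}$ and $\hat{\jmath}$ are made up) that builds a matrix from the transformed basis vectors and confirms the linear-combination property:

```python
import numpy as np

# Suppose the transformation sends i-hat to (1, 1) and j-hat to (-1, 2).
i_image = np.array([1.0, 1.0])
j_image = np.array([-1.0, 2.0])

# Package the landing spots as the columns of the matrix.
A = np.column_stack([i_image, j_image])

v = np.array([3.0, 2.0])  # v = 3*i-hat + 2*j-hat
# A @ v is the same linear combination of the *transformed* basis vectors.
print(np.allclose(A @ v, 3 * i_image + 2 * j_image))  # True
```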
Ex. If $V$ and $W$ are vector spaces with bases $\mathcal{B}$ and $\mathcal{C}$, respectively, and $T: V \to W$ is linear, then there exists an $m \times n$ matrix $A$ such that for every vector $\vec{v} \in V$, $[T(\vec{v})]_{\mathcal{C}} = A[\vec{v}]_{\mathcal{B}}$. This matrix is obviously dependent on the bases $\mathcal{B}$ and $\mathcal{C}$; to reflect this dependence, we write $A_{\mathcal{B}}^{\mathcal{C}}$, where the subscript reflects the chosen basis for the domain of the transformation, and the superscript reflects the chosen basis for the range of the transformation.
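Here's a hedged numpy sketch of that statement (the map $M$, the domain basis $\mathcal{B}$, and the choice of the standard basis for $\mathcal{C}$ are all assumptions for illustration): the columns of $A_{\mathcal{B}}^{\mathcal{C}}$ are the images of the $\mathcal{B}$ basis vectors written in $\mathcal{C}$ coordinates.

```python
import numpy as np

# T is multiplication by M (in standard coordinates).
M = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Domain basis B (as columns); range basis C = standard basis.
B = np.array([[1.0,  1.0],
              [1.0, -1.0]])

# Columns of A are [T(b1)]_C and [T(b2)]_C; with C standard, that's just T(b_i).
A = M @ B

v = np.array([2.0, 5.0])
v_B = np.linalg.solve(B, v)         # coordinate vector [v]_B
print(np.allclose(A @ v_B, M @ v))  # [T(v)]_C == A [v]_B -> True
```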
Image of a Transformation
The image of a transformation $T: V \to W$ is denoted $\operatorname{im}(T)$, and is the subset of $W$ of all $T(\vec{v})$ for $\vec{v} \in V$. For any linear transformation $T: V \to W$,
- $\vec{0} \in \operatorname{im}(T)$, since $T(\vec{0}) = \vec{0}$.
- $\operatorname{im}(T)$ is a subspace of $W$.
- $T$ is onto if and only if $\operatorname{im}(T) = W$.
If $\dim V = \dim W$ and $T: V \to W$ is linear, then $T$ is onto if and only if $T$ is one-to-one.
Rank
Some squishes (*sigh*... transformations) are more "squishy" than others: 3d space squished into a line is more "squished" than 3d space squished into a plane, but both transformations have determinant $0$. The rank describes this; it is the number of dimensions in the output. A transformation whose output is a line has a rank of $1$, while a transformation whose output is a plane has a rank of $2$.
Formally, the rank of a linear transformation $T$ is

$$\operatorname{rank}(T) = \dim(\operatorname{im}(T))$$
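To make the rank/determinant distinction concrete, here's a small numpy sketch (both matrices are made-up examples): each has determinant $0$, but one squishes 3d space onto a plane and the other onto a line.

```python
import numpy as np

plane_squish = np.array([[1.0, 0.0, 0.0],
                         [0.0, 1.0, 0.0],
                         [0.0, 0.0, 0.0]])  # flattens 3d space onto the xy-plane
line_squish = np.array([[1.0, 2.0, 3.0],
                        [2.0, 4.0, 6.0],
                        [3.0, 6.0, 9.0]])   # every column is a multiple of (1, 2, 3)

for A in (plane_squish, line_squish):
    print(np.linalg.det(A), np.linalg.matrix_rank(A))
# determinant is 0 (up to floating-point noise) for both, but the ranks are 2 and 1
```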
Null Space/Kernel
The set of vectors that get transformed into the origin during a transformation. For a transformation with full rank, the null space is just the zero vector. For a rank zero transformation of $n$-dimensional space, the null space is all of $n$-dimensional space. The kernel (or nullspace) of a transformation $T: V \to W$ is:

$$\ker(T) = \{\vec{v} \in V : T(\vec{v}) = \vec{0}\}$$
For any transformation $T: V \to W$,
- $\vec{0} \in \ker(T)$.
- $\ker(T)$ is a subspace of $V$.
- If $V$ is finite dimensional and $T$ is linear, then $\dim(\ker(T)) + \dim(\operatorname{im}(T)) = \dim(V)$.
- A transformation $T$ is one-to-one if and only if $\ker(T) = \{\vec{0}\}$.
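Here's a short sketch of that dimension fact using numpy and scipy (the matrix is an arbitrary example): the dimensions of the kernel and the image add up to the dimension of the domain.

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])  # row 3 = row 1 + row 2, so the rank is 2

rank = np.linalg.matrix_rank(A)
nullity = null_space(A).shape[1]  # basis vectors for ker(A), stored as columns
print(rank, nullity, rank + nullity == A.shape[1])  # 2 1 True
```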
Nullity
The dimension of the nullspace of $T$ is called the nullity of $T$, and is denoted by $\operatorname{nullity}(T)$.
Injective
A transformation $T$ is injective (one-to-one) if for all vectors $\vec{u}, \vec{v}$, $T(\vec{u}) = T(\vec{v})$ implies $\vec{u} = \vec{v}$.
Surjective
A transformation $T: V \to W$ is onto (or surjective) if $\operatorname{im}(T) = W$. So $T$ is onto if every element in the range $W$ is in the image of $T$.
Isomorphic
If $T: V \to W$ is linear, onto, and one-to-one, then it is an isomorphism, and an isomorphism exists between $V$ and $W$, denoted $V \cong W$. Isomorphism is reflexive, transitive, and symmetric. Since isomorphism is an equivalence relation, $V \cong W$ if and only if $\dim V = \dim W$ (there exist isomorphisms $V \cong \mathbb{R}^n$ and $W \cong \mathbb{R}^n$). This means that any $n$-dimensional vector space is just a copy of any other $n$-dimensional vector space, and any vector space (up to the names of its elements) is completely determined by its dimension.
Important Linear Transformations
- $90°$ rotation counterclockwise: $\begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}$
- Shear ($\hat{\imath}$ remains fixed and $\hat{\jmath}$ moves to $\begin{bmatrix} 1 \\ 1 \end{bmatrix}$): $\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$
- Counterclockwise rotation by $\theta$ around the origin: $\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$
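A minimal numpy sketch applying these matrices (the angle and the test vectors are arbitrary choices):

```python
import numpy as np

theta = np.pi / 4  # 45 degrees counterclockwise
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
shear = np.array([[1.0, 1.0],
                  [0.0, 1.0]])  # i-hat fixed, j-hat moves to (1, 1)

print(rotation @ np.array([1.0, 0.0]))  # [0.7071... 0.7071...]
print(shear @ np.array([0.0, 1.0]))     # [1. 1.] -- j-hat lands on (1, 1)
```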
Similar Matrices
Two $n \times n$ matrices $A$ and $B$ are similar if there exists an invertible matrix $P$ such that $B = P^{-1}AP$. Similar matrices have the same rank, determinant, and trace.
As we'll see when we get to change of basis and determinants, similar matrices are matrices that represent the same linear transformation, but with different bases.
Similarity (denoted $A \sim B$) is transitive, symmetric, and reflexive.
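A quick numpy check (the matrices $A$ and $P$ are arbitrary, with $P$ chosen invertible) that $B = P^{-1}AP$ shares rank, determinant, and trace with $A$:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 2.0],
              [1.0, 1.0]])  # any invertible matrix works here

B = np.linalg.inv(P) @ A @ P
for f in (np.linalg.matrix_rank, np.linalg.det, np.trace):
    print(f(A), f(B))  # each pair agrees (up to floating-point error)
```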
Transformation Composition (Matrix Multiplication)
Applying one transformation, then another, is still a linear transformation! So, there is a matrix that describes this composition of transformations, and we can call it the "product" of the two original matrices. However, since composition of transformations isn't commutative (and transformations are functions), we read matrix multiplication from right to left (just like we read $f(g(x))$). So how do we multiply matrices $A$ and $B$?
There are two ways to think about it. The first way is the intuitive way: the first column of $B$ shows where $\hat{\imath}$ is transformed to, and the second shows where $\hat{\jmath}$ is transformed to. Then, just multiply the two vectors by $A$ to get where the two transformed vectors go:

$$AB = A \begin{bmatrix} | & | \\ \vec{b}_1 & \vec{b}_2 \\ | & | \end{bmatrix} = \begin{bmatrix} | & | \\ A\vec{b}_1 & A\vec{b}_2 \\ | & | \end{bmatrix}$$
And this shows us the second way of thinking about multiplying matrices: memorize the formula!

$$\begin{bmatrix} a & b \\ c & d \end{bmatrix} \begin{bmatrix} e & f \\ g & h \end{bmatrix} = \begin{bmatrix} ae + bg & af + bh \\ ce + dg & cf + dh \end{bmatrix}$$
P.S. I do not support the second way.
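For what it's worth, the first way is easy to check in numpy (the matrices here are arbitrary): the $i$-th column of $AB$ is just $A$ applied to the $i$-th column of $B$.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, -1.0],
              [1.0,  0.0]])

product = A @ B
# Column i of A @ B is A applied to column i of B.
by_columns = np.column_stack([A @ B[:, 0], A @ B[:, 1]])
print(np.allclose(product, by_columns))  # True
```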
If you recall, in an earlier section we talked about the derivative as a linear transformation, denoted by a matrix with respect to the standard basis $\{1, x, x^2, \dots\}$ of polynomial space. Do you see how we arrived at that matrix now? We just followed where each basis vector went!
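As a concrete sketch (assuming, for illustration, polynomials of degree at most $3$ with basis $\{1, x, x^2, x^3\}$), we can rebuild the derivative matrix column by column and apply it:

```python
import numpy as np

# Derivative on polynomials of degree <= 3, basis {1, x, x^2, x^3}.
# Each column is the derivative of one basis vector, in the same basis:
# d/dx(1) = 0, d/dx(x) = 1, d/dx(x^2) = 2x, d/dx(x^3) = 3x^2.
D = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 2.0, 0.0],
              [0.0, 0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0, 0.0]])

p = np.array([5.0, 3.0, 2.0, 1.0])  # coordinates of 5 + 3x + 2x^2 + x^3
print(D @ p)  # [3. 4. 3. 0.], i.e. 3 + 4x + 3x^2
```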