Vector
The first, most basic concept in linear algebra. A vector can be anything, as we'll soon learn. At its most basic, a vector is a list of scalars. For example, in $\mathbb{R}^2$, a vector is a list of two scalars. In $\mathbb{R}^3$, a vector is a list of three scalars. In $\mathbb{R}^n$, a vector is a list of $n$ scalars. A vector can be represented as a column matrix, or as a list of scalars separated by commas. For example, the vector $\begin{bmatrix} 1 \\ 2 \end{bmatrix}$ can also be written as $(1, 2)$.
Real Coordinate Spaces ($\mathbb{R}^n$)
If you've taken analysis, you're familiar with a Cartesian coordinate system — every point is a unique combination of real numbers. This is a real coordinate space. The set of all $n$-tuples of real numbers is denoted $\mathbb{R}^n$. For example, $\mathbb{R}^2$ is the set of all 2-tuples of real numbers, and $\mathbb{R}^3$ is the set of all 3-tuples of real numbers. Here's the general form of a vector in $\mathbb{R}^n$:

$$\vec{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}$$
Note that the "points" we've always worked with in high school, like $(x, y)$, are actually vectors in $\mathbb{R}^2$. On another note, there also exist complex coordinate spaces ($\mathbb{C}^n$), where the scalars are complex numbers — but we won't focus on them.
Zero Vector
The zero vector is a vector whose components are all zero. It is denoted by $\vec{0}$, and exists in every real coordinate space.
Adding Vectors
For vectors with identical dimensions, just add the corresponding components:

$$\vec{a} + \vec{b} = \begin{bmatrix} a_1 \\ \vdots \\ a_n \end{bmatrix} + \begin{bmatrix} b_1 \\ \vdots \\ b_n \end{bmatrix} = \begin{bmatrix} a_1 + b_1 \\ \vdots \\ a_n + b_n \end{bmatrix}$$
If the dimensions are different, you can (if needed) extend the vector in the lower space by adding zeroes to its higher dimensions (like with polynomials — a quadratic will have a coefficient of $0$ on the $x^3$ term).
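Component-wise addition with zero-padding can be sketched as follows. This is a minimal illustration, not part of the original notes; `vec_add` is a hypothetical helper name.

```python
def vec_add(a, b):
    # Pad the shorter vector with zeros so the dimensions match,
    # mirroring the "extend with zeroes" trick described above.
    n = max(len(a), len(b))
    a = list(a) + [0] * (n - len(a))
    b = list(b) + [0] * (n - len(b))
    # Then add corresponding components.
    return [x + y for x, y in zip(a, b)]

print(vec_add([1, 2], [3, 4, 5]))  # [4, 6, 5]
```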
Multiplying by a Scalar
Multiplying vectors by a scalar is easy! Just multiply each component by the scalar.
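As a quick sketch (again with a hypothetical helper name), scalar multiplication is just a component-wise loop:

```python
def scalar_mul(c, v):
    # Multiply each component of v by the scalar c.
    return [c * x for x in v]

print(scalar_mul(3, [1, -2, 4]))  # [3, -6, 12]
```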
Unit Vector
A vector with a magnitude of $1$. To normalize vectors, divide each component by the magnitude ($\hat{v} = \frac{\vec{v}}{\|\vec{v}\|}$).
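Normalization can be sketched like this — a minimal example assuming a Euclidean magnitude, with `normalize` as a hypothetical helper name:

```python
import math

def normalize(v):
    # Magnitude is the square root of the sum of squared components.
    mag = math.sqrt(sum(x * x for x in v))
    # Divide each component by the magnitude.
    return [x / mag for x in v]

u = normalize([3, 4])  # magnitude is 5, so we get [0.6, 0.8]
print(u)
```

The result has magnitude $1$: $0.6^2 + 0.8^2 = 1$.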
Standard Basis
Until we formally cover bases, keep in the back of your mind that every vector in $\mathbb{R}^n$ is a combination of the standard basis vectors (which we'll call $\hat{i}$, $\hat{j}$, $\hat{k}$, etc.). For example, in $\mathbb{R}^2$, the standard basis vectors are, in order, $\hat{i} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$ and $\hat{j} = \begin{bmatrix} 0 \\ 1 \end{bmatrix}$.
Dot Product — The projection of $\vec{a}$ onto $\vec{b}$ times the length of $\vec{b}$. The dot product can be negative (the length of the projection would be negative if it were in the opposite direction of $\vec{b}$)! Calculated by multiplying the entries in each vector and adding them up:

$$\vec{a} \cdot \vec{b} = a_1 b_1 + a_2 b_2 + \dots + a_n b_n$$

Perpendicular vectors have dot product $0$ because the length of the projection is $0$. We can use dot products to determine whether vectors are facing in generally the same direction (dot product $> 0$), perpendicular (dot product $= 0$), or different directions (dot product $< 0$). This is just something you may come across in other branches of mathematics, but we won't need dot products much in linear algebra.
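The "multiply entries and add them up" formula, plus the sign interpretation above, can be sketched in a few lines (the helper name `dot` is mine, not from the notes):

```python
def dot(a, b):
    # Multiply corresponding entries and sum them.
    return sum(x * y for x, y in zip(a, b))

print(dot([1, 2], [3, 4]))   # 11 -> generally the same direction
print(dot([1, 0], [0, 5]))   # 0  -> perpendicular
print(dot([1, 0], [-2, 1]))  # -2 -> generally opposite directions
```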
Now we need to define some terms that will be useful in the future.
Linear Combination
If you have some vectors $\vec{v}_1, \vec{v}_2, \dots, \vec{v}_n$ in $\mathbb{R}^m$ (real space), a linear combination is a sum of these vectors, where each vector is scaled by a real constant:

$$c_1 \vec{v}_1 + c_2 \vec{v}_2 + \dots + c_n \vec{v}_n, \quad c_i \in \mathbb{R}$$
As mentioned earlier, every vector in $\mathbb{R}^n$ can be represented as a linear combination of the standard basis vectors — but we'll see this term pop up a lot more!
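A linear combination is just "scale each vector, then add everything up." Here's a minimal sketch (hypothetical helper name), using the standard-basis decomposition mentioned above as the example:

```python
def linear_combination(coeffs, vectors):
    # Start with the zero vector, then add each scaled vector.
    result = [0] * len(vectors[0])
    for c, v in zip(coeffs, vectors):
        result = [r + c * x for r, x in zip(result, v)]
    return result

# 3*i-hat + 4*j-hat = [3, 4]: every vector in R^2 is a
# combination of the standard basis vectors.
print(linear_combination([3, 4], [[1, 0], [0, 1]]))  # [3, 4]
```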
Linear Dependence
A set that is linearly dependent is a set where a member vector can be represented as a linear combination of other vectors in the set (i.e., that vector doesn't add any new "dimensionality" to the set). A more formal definition: a set $\{\vec{v}_1, \vec{v}_2, \dots, \vec{v}_n\}$ is linearly dependent if and only if

$$c_1 \vec{v}_1 + c_2 \vec{v}_2 + \dots + c_n \vec{v}_n = \vec{0}$$

where not all $c_i$ are zero. Or, a set is linearly independent if the only solution to the above equation is $c_1 = c_2 = \dots = c_n = 0$.
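For two vectors in $\mathbb{R}^2$, dependence is easy to check concretely: the vectors are dependent exactly when they are collinear, i.e. when the determinant $v_1 w_2 - v_2 w_1$ is zero. This is a sketch for the 2D case only (the helper name is mine); the general case needs tools we haven't covered yet.

```python
def dependent_2d(v, w):
    # Two vectors in R^2 are linearly dependent exactly when the
    # determinant of the 2x2 matrix with columns v and w is zero
    # (i.e., one is a scalar multiple of the other).
    return v[0] * w[1] - v[1] * w[0] == 0

print(dependent_2d([1, 2], [2, 4]))  # True: [2, 4] = 2 * [1, 2]
print(dependent_2d([1, 2], [3, 4]))  # False: they point in genuinely different directions
```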
Span
A span is defined as the set of all linear combinations of a set of vectors. For example, two linearly independent vectors in $\mathbb{R}^2$ span a plane (all of $\mathbb{R}^2$). The span of a set of collinear $n$-tuples is the line on which they are collinear. Notation:

$$\operatorname{span}(\vec{v}_1, \dots, \vec{v}_n) = \{c_1 \vec{v}_1 + \dots + c_n \vec{v}_n \mid c_i \in \mathbb{R}\}$$