From First Principles
Tags | FormalMath 113
---|---
What is this? ⭐
This is a first-principles build-up of linear algebra, ending with linear maps
Definition: Fields
A field $\mathbb{F}$ has at least two elements and has two binary operations, $+$ and $\cdot$
- the associative, commutative, and distributive laws are satisfied by $+$ and $\cdot$
- a field has an additive identity ($0$) and a multiplicative identity ($1$)
- every element of the field has an additive inverse (which is in fact unique, though that is a theorem rather than an axiom)
- every nonzero element also has a multiplicative inverse (so the integers are not a field: $2$ has no integer reciprocal)
Associativity, the additive identity, and additive inverses are enough to qualify $(\mathbb{F}, +)$ as a group, and since $+$ is commutative, it is an abelian group
Finite fields
- finite number of elements in the set
- defined as $\mathbb{F}_p = \{0, 1, \dots, p - 1\}$ with addition and multiplication mod $p$, where $p$ is a prime number
- yes, it needs to be a prime number; if $n = ab$ is composite with $1 < a, b < n$, then $ab \equiv 0 \pmod{n}$, so $a$ is a zero divisor with no multiplicative inverse, and $\mathbb{Z}_n$ is not a field
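To make this concrete, here is a minimal Python sketch (the helper name `is_field_mod` is my own) that brute-force checks whether every nonzero element of $\mathbb{Z}_n$ has a multiplicative inverse:

```python
# Brute-force check: Z_n is a field iff every nonzero element has a
# multiplicative inverse mod n (the remaining axioms hold automatically).
def is_field_mod(n: int) -> bool:
    return all(
        any((a * b) % n == 1 for b in range(1, n))
        for a in range(1, n)
    )

print(is_field_mod(5))  # True: 5 is prime
print(is_field_mod(4))  # False: 2*b mod 4 is always 0 or 2, never 1
```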
Definition: Vector Spaces
A vector space $V$ is defined over a scalar field $\mathbb{F}$, and it comes with a vector addition and a scalar multiplication
- It is closed under vector addition and scalar multiplication
- a vector space has a unique additive identity (this is provable from the axioms)
- a vector space has unique additive inverses (also provable; see the sketch below)
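Both uniqueness proofs are one-liners; here is a sketch of the standard argument (not spelled out in the original notes):

```latex
% Uniqueness of the additive identity: suppose 0 and 0' both act as identities.
\[ 0' = 0' + 0 = 0 \]
% Uniqueness of additive inverses: suppose w and w' are both inverses of v.
\[ w = w + 0 = w + (v + w') = (w + v) + w' = 0 + w' = w' \]
```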
Vector spaces can also just be made of matrices (i.e. your basis vectors are matrices). Vector spaces can be as broad as all of $\mathbb{F}^n$, or as narrow as the zero space $\{0\}$. Sometimes, they lie in-between, i.e. the ambient space is $\mathbb{R}^3$ but the vector space itself is some plane or line through the origin.
We define subspaces as vector spaces that are subsets of some other vector space. For example, $U$ is a subspace of $V$ if $U \subseteq V$ and $U$ is itself a vector space under the same operations.
You can have vector spaces of functions; the functions themselves do not need to be linear, since the vector operations are defined pointwise
Function spaces
- defined as $\mathbb{R}^{[0,1]}$, the set of all functions $f : [0, 1] \to \mathbb{R}$ on the interval $[0, 1]$
- this is a vector space because it has an additive operation and a scalar multiplication operation
- you can also think of $f$ as an uncountably-infinite-entry vector, and $f(x)$ would be indexing the vector at location $x$
- this is why $\mathbb{F}^n$ is no different than how we defined a function space. A vector in $\mathbb{F}^n$ is a mapping from each index $1, \dots, n$ to an entry, and each "mapping" determines what the vector will be (see the sketch below)
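Here is a minimal Python sketch of this idea (my own illustration): functions behave as vectors, with addition and scalar multiplication defined pointwise.

```python
import math

# Functions form a vector space: the operations act pointwise and
# return new functions.
def add(f, g):
    return lambda x: f(x) + g(x)          # (f + g)(x) = f(x) + g(x)

def scale(c, f):
    return lambda x: c * f(x)             # (c*f)(x) = c * f(x)

h = add(math.sin, math.cos)               # h = sin + cos
print(h(0.0))                             # 1.0, since sin(0) + cos(0) = 1
print(scale(2.0, math.sin)(math.pi / 2))  # 2.0, since 2*sin(pi/2) = 2
```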
Subspace
- A subspace of a vector space $V$ is a subset of $V$ that is itself a vector space, i.e. it is closed under addition and scalar multiplication
- this means that a vector subspace must include the zero vector
Subspace sum
The sum $U_1 + \dots + U_m$ is the set of all possible sums of elements from $U_1, \dots, U_m$.
Or, in mathematical terms:
$$U_1 + \dots + U_m = \{ u_1 + \dots + u_m : u_1 \in U_1, \dots, u_m \in U_m \}$$
- it is worth noting that it is NOT a union operation. Unions are not guaranteed to be a subspace
- consider two perpendicular lines through the origin. Each line can be a subspace, but the union of the two perpendicular lines is not a subspace (see the sketch below)
- a subspace sum, in contrast, will cover the whole plane spanned by the two perpendicular lines
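A small numeric sketch of the perpendicular-lines example, assuming the two lines are the coordinate axes of $\mathbb{R}^2$:

```python
import numpy as np

u = np.array([1.0, 0.0])   # a vector on the x-axis subspace
w = np.array([0.0, 1.0])   # a vector on the y-axis subspace

v = u + w                  # [1, 1] lies on neither axis, so the UNION of
print(v)                   # the two lines is not closed under addition
# The subspace SUM, however, is all of R^2: any [a, b] decomposes as
# a*[1, 0] + b*[0, 1], one piece from each line.
```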
Direct sums
- A direct sum $V = U_1 \oplus \dots \oplus U_m$ means that every vector in $V$ can be written as $u_1 + \dots + u_m$ (with each $u_i \in U_i$) in only one way
- this includes the zero vector; in fact, it is sufficient to show that the zero vector is uniquely made (only by taking every $u_i = 0$)
- if $U$ and $W$ are subspaces of $V$, then $U + W$ is a direct sum if and only if $U \cap W = \{0\}$
- this is kinda intuitive (if there were any other common elements, then you would be able to construct non-unique combinations)
- in this case, we call $U$ and $W$ complements of each other. Complements are not unique (see the example below)
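As a concrete illustration of non-uniqueness (my example, not from the notes), take $U$ to be the x-axis in $\mathbb{R}^2$:

```latex
% Let U = the x-axis, W_1 = the y-axis, and W_2 = the line y = x. Then
\[ \mathbb{R}^2 = U \oplus W_1 = U \oplus W_2, \qquad U \cap W_1 = U \cap W_2 = \{0\}, \]
% so U has (at least) two different complements.
```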
Linear combination and span
- the span is just the set of all linear combinations of a list of vectors: $\operatorname{span}(v_1, \dots, v_m) = \{ a_1 v_1 + \dots + a_m v_m : a_1, \dots, a_m \in \mathbb{F} \}$
- the span of any list of vectors forms a subspace (closed under addition and scalar multiplication)
Linear independence
- A list of vectors $v_1, \dots, v_m$ is linearly independent if the only choice of $a_1, \dots, a_m$ that makes $a_1 v_1 + \dots + a_m v_m = 0$ is all zeros (i.e. only one trivial solution exists)
- In contrast, linear dependence is when you can find a non-trivial solution
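Computationally, a standard way to test this (my choice of method, not from the notes) is a rank check with numpy:

```python
import numpy as np

def independent(vectors) -> bool:
    # Stack the vectors as columns; they are linearly independent iff the
    # rank equals the number of vectors, i.e. A @ a = 0 has only the
    # trivial solution.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

print(independent([np.array([1.0, 0.0]), np.array([0.0, 1.0])]))  # True
print(independent([np.array([1.0, 2.0]), np.array([2.0, 4.0])]))  # False: v2 = 2*v1
```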
Linear dependence lemma
If a list $v_1, \dots, v_m$ is linearly dependent, there must exist some $v_j$ in there such that $v_j \in \operatorname{span}(v_1, \dots, v_{j-1})$, and removing that $v_j$ from the list does not change the span
Length lemma
If $V$ is spanned by $n$ vectors, there can't be more than $n$ linearly independent vectors in $V$
Basis
- a basis of a vector space $V$ is a list of linearly independent vectors whose span is $V$
- a list of vectors is a basis of $V$ iff every $v \in V$ can be written uniquely as a linear combination of the vectors in that list
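Numerically, those unique coefficients come from solving a linear system; a sketch with numpy (assuming the basis vectors are the columns of `B`):

```python
import numpy as np

B = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # basis of R^2: (1, 0) and (1, 1) as columns
v = np.array([3.0, 2.0])

a = np.linalg.solve(B, v)    # unique because B's columns form a basis
print(a)                     # [1. 2.]  ->  v = 1*(1,0) + 2*(1,1)
```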
Lemma: spanning sets
- spanning sets contain a basis
- in other words, every spanning list can be reduced to a basis (you do this by "kicking out" certain vectors until you get a linearly independent list of vectors)
Lemma: Linearly independent lists
- every linearly independent list of vectors in a finite-dimensional vector space can be extended to a basis of that space (see the sketch below)
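A sketch of the extension argument in numpy (the helper name `extend_to_basis` is my own): greedily append standard basis vectors, keeping only the ones that preserve linear independence.

```python
import numpy as np

def extend_to_basis(vectors, dim):
    basis = list(vectors)
    for i in range(dim):
        e = np.zeros(dim)
        e[i] = 1.0
        candidate = np.column_stack(basis + [e])
        # keep e only if it is independent of what we have so far
        if np.linalg.matrix_rank(candidate) == len(basis) + 1:
            basis.append(e)
    return basis

print(extend_to_basis([np.array([1.0, 1.0, 0.0])], 3))
# [array([1., 1., 0.]), array([1., 0., 0.]), array([0., 0., 1.])]
```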
Lemma: direct sums
- for every subspace $U$ of a finite-dimensional vector space $V$, it is always possible to find a subspace $W$ such that $V = U \oplus W$
- this makes sense, because you can think of $W$ as being "perpendicular" to the subspace $U$ (think about it in 3D and it makes sense)
Dimension
- the dimension of any finite-dimensional vector space is the number of (linearly independent) vectors in a basis; this is well-defined because every basis has the same length
Basis and dimension lemma
If $V$ is finite-dimensional, then every linearly independent list of vectors in $V$ with length $\dim V$ is a basis of $V$
Sum and intersection lemma
$$\dim(U_1 + U_2) = \dim U_1 + \dim U_2 - \dim(U_1 \cap U_2)$$
- this is intuitive because if there is any overlap, you end up double-counting the dimensions
Also: if the sum is direct, then $\dim(U_1 \oplus U_2) = \dim U_1 + \dim U_2$, since $\dim(U_1 \cap U_2) = 0$
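A numeric sanity check of the formula in $\mathbb{R}^3$ (my own example): let $U$ be the xy-plane and $W$ the yz-plane, so $U \cap W$ is the y-axis.

```python
import numpy as np

U = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]]).T   # columns span the xy-plane
W = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]]).T   # columns span the yz-plane

dim_U = np.linalg.matrix_rank(U)                    # 2
dim_W = np.linalg.matrix_rank(W)                    # 2
dim_sum = np.linalg.matrix_rank(np.hstack([U, W]))  # 3 = dim(U + W)
print(dim_U + dim_W - dim_sum)                      # 1 = dim of the intersection, the y-axis
```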
Linear maps
Linear maps $T : V \to W$ map between vector spaces $V$ and $W$, where $V$ is the domain and $W$ is the codomain. They satisfy additivity and homogeneity:
$$T(u + v) = T(u) + T(v) \quad \text{and} \quad T(\lambda v) = \lambda T(v)$$
To define a linear map, all you need to do is take a basis of $V$ and define where the basis vectors go. This is because, by the previous two properties, you can now transform any vector in $V$!
- $T(a_1 v_1 + \dots + a_n v_n) = a_1 T(v_1) + \dots + a_n T(v_n)$
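A small numpy sketch of this (the map and its values are my own example): fix where the standard basis vectors of $\mathbb{R}^2$ go, and linearity determines everything else.

```python
import numpy as np

T_e1 = np.array([1.0, 1.0])     # where T sends e1
T_e2 = np.array([-1.0, 1.0])    # where T sends e2

def T(v):
    # linearity: T(a*e1 + b*e2) = a*T(e1) + b*T(e2)
    return v[0] * T_e1 + v[1] * T_e2

print(T(np.array([2.0, 3.0])))  # [-1.  5.] = 2*(1,1) + 3*(-1,1)
```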
Product of linear maps
If $T \in \mathcal{L}(U, V)$ and $S \in \mathcal{L}(V, W)$, then the product $ST \in \mathcal{L}(U, W)$ is given by $(ST)(u) = S(T(u))$. You compose linear maps like functions
Linear maps take $0$ to $0$ (required by additivity: $T(0) = T(0 + 0) = T(0) + T(0)$, so $T(0) = 0$)
You can also think of matrix multiplication as a linear map $T(X) = AX$, where $A$ is a fixed matrix and $X$ is another matrix.
Vector space of linear maps
$\mathcal{L}(V, W)$ is the set of all linear maps from $V$ to $W$.
Interestingly, $\mathcal{L}(V, W)$ is itself a vector space! This is because it's closed under addition ($(S + T)(v) = S(v) + T(v)$) and closed under scalar multiplication ($(\lambda T)(v) = \lambda T(v)$).
Linear maps and Matrices
Any linear map is a multiplication by an $m \times n$ matrix $A = \mathcal{M}(T)$. More specifically:
If $T \in \mathcal{L}(V, W)$, $v_1, \dots, v_n$ is a basis of $V$, and $w_1, \dots, w_m$ is a basis of $W$, then the matrix $A = \mathcal{M}(T)$ has the following property:
- $T v_k = A_{1,k} w_1 + \dots + A_{m,k} w_m$
Every column $k$ in the matrix representation of $T$ represents the transformed basis vector $T v_k$, expressed in terms of the basis vectors $w_1, \dots, w_m$ of $W$
Connecting to common matrices
Common matrices that you see in real life use the standard basis for both $V$ and $W$, so this will be a bit easier: column $k$ is literally $T$ applied to the $k$-th standard basis vector.
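A sketch of reading off $\mathcal{M}(T)$ in the standard basis (the helper name `matrix_of` is my own): column $k$ is just $T$ applied to the $k$-th standard basis vector.

```python
import numpy as np

def matrix_of(T, n):
    # column k of M(T) is T applied to the k-th standard basis vector
    eye = np.eye(n)
    return np.column_stack([T(eye[:, k]) for k in range(n)])

def rot90(v):
    return np.array([-v[1], v[0]])    # rotate R^2 by 90 degrees

M = matrix_of(rot90, 2)
print(M)                              # [[ 0. -1.]
                                      #  [ 1.  0.]]
print(M @ np.array([2.0, 3.0]))       # [-3.  2.], same as rot90([2, 3])
```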