From First Principles

Tags: Formal, Math 113

What is this? ⭐

This is a first-principles build-up of linear algebra definitions, ending with linear maps.

Definition: Fields

A field has at least two elements and two binary operations, $+$ and $\cdot$.

Under $+$, the associativity, additive identity, and additive inverse axioms make a field a group, and an abelian group since $+$ is commutative; the nonzero elements form an abelian group under $\cdot$, and the two operations are tied together by the distributive property.

Finite fields

For example, $\{0, 1, \dots, p-1\}$ with addition and multiplication mod a prime $p$ is a finite field: every nonzero element has a multiplicative inverse.
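A minimal runnable sketch of this (my own example, using Python's built-in modular inverse):

```python
# Sketch: integers mod a prime p form a finite field.
# Every nonzero element has a multiplicative inverse mod p.
p = 5
for a in range(1, p):
    inv = pow(a, -1, p)            # modular inverse (Python 3.8+)
    assert (a * inv) % p == 1
    print(f"{a} * {inv} = 1 (mod {p})")
```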

Definition: Vector Spaces

A vector space is defined over a scalar field $\mathbb{F}$, and it comes with a vector addition and a scalar multiplication.

The vectors can also just be matrices (i.e. your basis vectors are matrices). Vector spaces can be as broad as $\mathbb{R}^n$, or as narrow as $\{0\}$. Sometimes, they lie in-between, i.e. the overall space is $\mathbb{R}^n$ but the vector space itself is some $\text{span}(v_1, \dots, v_k)$.

We define subspaces as vector spaces that are subsets of some other vector space. For example, $\text{span}(v_1, \dots, v_k)$ with each $v_i \in \mathbb{R}^n$ is a subspace of $\mathbb{R}^n$ (a proper one if $k < n$, since $k$ vectors span at most a $k$-dimensional subspace).
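As an illustrative sketch (the example vectors are mine, and I'm assuming numpy): membership in a span can be tested by checking whether appending the vector raises the matrix rank.

```python
import numpy as np

# Columns v1 = (1,0,1), v2 = (0,1,1) span a 2D subspace of R^3.
V = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
w = np.array([2.0, 3.0, 5.0])   # 2*v1 + 3*v2, so w is in the span

# w in span(v1, v2)  <=>  rank([V | w]) == rank(V)
in_span = (np.linalg.matrix_rank(np.column_stack([V, w]))
           == np.linalg.matrix_rank(V))
print(in_span)  # True
```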

You can also have vector spaces of functions. The functions themselves don't need to be linear: for any set $S$, the set of all functions $f: S \to \mathbb{F}$ is a vector space under pointwise addition and scalar multiplication.

Function spaces

Subspace

Subspace sum

The sum $U_1 + \dots + U_m$ is the set of all possible sums of elements from $U_1, \dots, U_m$.

Or, in mathematical terms: $U + W = \{u + w \mid u \in U, w \in W\}$
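Concretely (a hypothetical numpy sketch): if each subspace is given by a list of generating columns, $U + W$ is generated by all of those columns together.

```python
import numpy as np

B_U = np.array([[1.0], [0.0], [0.0]])   # U = span{e1} in R^3
B_W = np.array([[0.0], [1.0], [0.0]])   # W = span{e2}

# U + W is spanned by the generators of U and W combined.
B_sum = np.column_stack([B_U, B_W])
print(np.linalg.matrix_rank(B_sum))     # dim(U + W) = 2
```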

Direct sums

A sum $U_1 + \dots + U_m$ is a direct sum, written $U_1 \oplus \dots \oplus U_m$, if every element of it can be written as $u_1 + \dots + u_m$ (with $u_i \in U_i$) in exactly one way.

Linear combination and span

A linear combination of $v_1, \dots, v_m$ is any $a_1 v_1 + \dots + a_m v_m$ with scalars $a_i \in \mathbb{F}$; $\text{span}(v_1, \dots, v_m)$ is the set of all such combinations.

Linear independence

$v_1, \dots, v_m$ are linearly independent if $a_1 v_1 + \dots + a_m v_m = 0$ forces $a_1 = \dots = a_m = 0$.

Linear dependence lemma

If $v_1, \dots, v_m$ is linearly dependent, there must exist some $v_j$ in the list such that $v_j \in \text{span}(v_1, \dots, v_{j-1})$.
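A sketch of the lemma in action (function name and example are mine, assuming numpy): scan the list left to right and report the first vector that lies in the span of its predecessors.

```python
import numpy as np

def first_dependent_index(vectors):
    """First j with v_j in span(v_0, ..., v_{j-1}), or None if independent."""
    stacked = None
    for j, v in enumerate(vectors):
        if stacked is None:
            if np.allclose(v, 0):   # v_0 = 0 lies in span() = {0}
                return j
            stacked = v.reshape(-1, 1)
            continue
        extended = np.column_stack([stacked, v])
        # If the rank doesn't grow, v_j added nothing new: it's dependent.
        if np.linalg.matrix_rank(extended) == np.linalg.matrix_rank(stacked):
            return j
        stacked = extended
    return None

vs = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
print(first_dependent_index(vs))   # 2, since v_2 = v_0 + v_1
```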

Length lemma

If $V$ is spanned by $n$ vectors, there can't be more than $n$ linearly independent vectors in $V$.

Basis

A basis of $V$ is a list of vectors in $V$ that is linearly independent and spans $V$.

Lemma: spanning sets

Lemma: Linearly independent lists

Lemma: direct sums

Dimension

The dimension $\dim V$ of a finite-dimensional vector space $V$ is the length of any basis of $V$ (all bases of $V$ have the same length).

Basis and dimension lemma

If $V$ is finite-dimensional, then every linearly independent list of vectors in $V$ with length $\dim V$ is a basis of $V$.
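A quick numerical check of this (my own example): three linearly independent vectors in $\mathbb{R}^3$ automatically form a basis, and independence can be read off the rank.

```python
import numpy as np

# Three vectors in R^3, as columns.
vs = np.array([[1.0, 0.0, 1.0],
               [1.0, 1.0, 0.0],
               [0.0, 1.0, 1.0]])

# Rank 3 = linearly independent; length 3 = dim R^3, hence a basis.
print(np.linalg.matrix_rank(vs) == 3)   # True
```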

Sum and intersection lemma

$\dim(U_1 + U_2) = \dim(U_1) + \dim(U_2) - \dim(U_1 \cap U_2)$

Also: $\dim(U_1 \oplus U_2) = \dim(U_1) + \dim(U_2)$, since a direct sum has $U_1 \cap U_2 = \{0\}$.
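A numpy sanity check of the sum-intersection formula (the example subspaces are mine):

```python
import numpy as np

# U1 = span{e1, e2} and U2 = span{e2, e3} in R^3, so U1 ∩ U2 = span{e2}.
B1 = np.eye(3)[:, :2]
B2 = np.eye(3)[:, 1:]

dim_U1 = np.linalg.matrix_rank(B1)                          # 2
dim_U2 = np.linalg.matrix_rank(B2)                          # 2
dim_sum = np.linalg.matrix_rank(np.column_stack([B1, B2]))  # 3

# Rearranged formula: dim(U1 ∩ U2) = dim U1 + dim U2 - dim(U1 + U2)
print(dim_U1 + dim_U2 - dim_sum)                            # 1
```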

Linear maps

Linear maps $T: V \to W$ map between $V$ and $W$, where $V$ is the domain and $W$ is the codomain.

$T(u + v) = Tu + Tv$ and $T(\lambda v) = \lambda Tv$

To define a linear map, all you need to do is pick a basis of $V$ and define where the basis vectors go. This is because, by the previous two properties, you can now transform any vector in $V$: write $v = a_1 v_1 + \dots + a_n v_n$ and get $Tv = a_1 T v_1 + \dots + a_n T v_n$!
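A sketch of extending by linearity (the basis and images are my own choices, assuming numpy): store only the basis and the images of the basis vectors, then recover $Tv$ for any $v$.

```python
import numpy as np

# Basis of V = R^2 (as columns) and chosen images T(v_i) in W = R^3.
basis = np.array([[1.0, 1.0],
                  [0.0, 1.0]])      # columns: v1 = (1, 0), v2 = (1, 1)
images = np.array([[1.0, 0.0],
                   [0.0, 1.0],
                   [1.0, 1.0]])     # columns: T(v1), T(v2)

def T(v):
    # Coordinates of v over the basis, then extend by linearity:
    # T(a1*v1 + a2*v2) = a1*T(v1) + a2*T(v2)
    a = np.linalg.solve(basis, v)
    return images @ a

print(T(np.array([2.0, 1.0])))      # v = v1 + v2, so Tv = (1, 1, 2)
```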

Product of linear maps

If $T \in \mathcal{L}(U, V)$ and $S \in \mathcal{L}(V, W)$, then $(ST)u = S(Tu)$. You compose linear maps like functions.
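In matrix form (a quick sketch over standard bases; the matrices are mine), composing maps is multiplying their matrices, with $T$ applied first:

```python
import numpy as np

A_T = np.array([[1.0, 2.0],
                [0.0, 1.0]])   # T : R^2 -> R^2
A_S = np.array([[0.0, 1.0],
                [1.0, 0.0]])   # S : R^2 -> R^2 (swap coordinates)

u = np.array([1.0, 1.0])
# (ST)u = S(Tu): the matrix of the composition is the product A_S @ A_T.
assert np.allclose((A_S @ A_T) @ u, A_S @ (A_T @ u))
print((A_S @ A_T) @ u)  # [1. 3.]
```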

Linear maps take $0$ to $0$ (this is forced by the properties above: $T(0) = T(0 + 0) = T(0) + T(0)$, so $T(0) = 0$).

You can also think of multiplication by a fixed matrix $A$ as a linear map $T: V \to W$, where $V$ and $W$ are themselves spaces of matrices (e.g. $T(X) = AX$).

Vector space of linear maps

$\mathcal{L}(V, W)$ is the set of all linear maps from $V$ to $W$.

Interestingly, $\mathcal{L}(V, W)$ is itself a vector space! This is because it's closed under addition ($(T_1 + T_2)(v) = T_1 v + T_2 v$) and closed under scalar multiplication ($(\lambda T)(v) = \lambda (Tv)$).

Linear maps and Matrices

Any linear map $T: \mathbb{F}^n \to \mathbb{F}^m$ is multiplication by an $m \times n$ matrix $A \in \mathbb{F}^{m \times n}$. More specifically:

If $T \in \mathcal{L}(V, W)$, and $v_1, \dots, v_n$ is a basis of $V$ and $w_1, \dots, w_m$ is a basis of $W$, then the matrix has the following property:

Column $k$ of the matrix representation of $T$ holds the coordinates of $T(v_k)$ in terms of the basis vectors of $W$, i.e. $Tv_k = A_{1,k} w_1 + \dots + A_{m,k} w_m$.
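A sketch of building this matrix column by column (the map and bases are my own example): solve for the coordinates of each $T(v_k)$ in the $W$ basis.

```python
import numpy as np

# T : R^2 -> R^2, say T(x, y) = (x + y, y), with non-standard bases.
def T(v):
    return np.array([v[0] + v[1], v[1]])

V_basis = np.array([[1.0, 1.0],
                    [0.0, 1.0]])   # columns v1, v2
W_basis = np.array([[1.0, 0.0],
                    [1.0, 1.0]])   # columns w1, w2

# Column k of A = coordinates of T(v_k) with respect to w1, ..., wm.
A = np.column_stack([np.linalg.solve(W_basis, T(V_basis[:, k]))
                     for k in range(V_basis.shape[1])])
print(A)

# Sanity check: for any v, coordinates-of-T(v) = A @ coordinates-of-v.
v = np.array([3.0, 2.0])
assert np.allclose(np.linalg.solve(W_basis, T(v)),
                   A @ np.linalg.solve(V_basis, v))
```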

Connecting to common matrices

Common matrices that you see in real life use the standard basis for both $V$ and $W$. This makes things a bit easier: column $i$ is literally $T(e_i)$, with no change of coordinates needed.
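A last sketch (same hypothetical map as above): with the standard basis, the coordinate solve disappears and the columns of the matrix are just $T(e_i)$.

```python
import numpy as np

def T(v):                                # T(x, y) = (x + y, y)
    return np.array([v[0] + v[1], v[1]])

# With standard bases, column i of the matrix is literally T(e_i).
A = np.column_stack([T(e) for e in np.eye(2)])
print(A)   # [[1. 1.]
           #  [0. 1.]]
```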