Special Matrices
types of matrices
- Identity matrix: $I$, with $I_{ii} = 1$ and zeros elsewhere
- One matrix (matrix of ones): every entry is $1$
- Diagonal matrix: $D_{ij} = 0$ for $i \neq j$
- Upper/lower triangular: zeros below/above the diagonal
- Block matrices: matrices partitioned into rectangular sub-blocks
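Each of these forms can be built directly in numpy; a quick sketch (the particular sizes and values are just for illustration):

```python
import numpy as np

I = np.eye(3)                               # identity matrix
J = np.ones((3, 3))                         # matrix of ones
D = np.diag([1.0, 2.0, 3.0])                # diagonal matrix
U = np.triu(np.arange(9.0).reshape(3, 3))   # upper triangular
L = np.tril(np.arange(9.0).reshape(3, 3))   # lower triangular

# block matrix assembled from 2x2 sub-blocks
B = np.block([[np.eye(2), np.zeros((2, 2))],
              [np.zeros((2, 2)), 2 * np.eye(2)]])
```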
invertible matrices
are square matrices $A$ for which there exists an inverse $A^{-1}$ such that $A A^{-1} = A^{-1} A = I$.
- an orthogonal matrix $Q$ is a square matrix such that $Q^T Q = Q Q^T = I$, and this is because the rows (and columns) are orthonormal (think about it in terms of the dot-product definition of the matrix product)
- a nilpotent matrix $N$ is constructed such that $N^k = 0$ for some positive, finite integer $k$
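Both definitions are easy to check numerically; a small sketch, using a 2D rotation as the orthogonal matrix and a strictly upper triangular matrix as the nilpotent one:

```python
import numpy as np

# orthogonal matrix: a rotation by 45 degrees has orthonormal rows/columns
theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(Q.T @ Q, np.eye(2))   # Q^T Q = I
assert np.allclose(Q @ Q.T, np.eye(2))   # Q Q^T = I

# nilpotent matrix: strictly upper triangular, so N^3 = 0
N = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])
assert np.allclose(np.linalg.matrix_power(N, 3), 0)
```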
Upper triangular matrices
Formally, this is the definition of upper triangular:
- $A_{ij} = 0$ whenever $i > j$
- or... $\operatorname{span}(e_1, \dots, e_j)$ is invariant under $T$ for every $j$
think about this for a second and it should make sense
First, we can actually show that over the complex numbers, every operator has an upper triangular matrix. The proof is a little involved, so we won't show it here.
Upper triangular matrices are invertible if and only if the diagonal entries are all non-zero. This is because the determinant of a triangular matrix is the product of its diagonal entries, so it becomes zero otherwise.
Therefore, the eigenvalues are just the entries on the diagonal for an upper-triangular matrix! This is because $A - \lambda I$ stays upper triangular, and it becomes non-invertible exactly when $\lambda$ equals one of the diagonal entries.
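A quick numerical sanity check of both claims, on an arbitrary upper triangular matrix:

```python
import numpy as np

A = np.array([[2.0, 5.0, 1.0],
              [0.0, 3.0, 4.0],
              [0.0, 0.0, 7.0]])   # upper triangular

# eigenvalues are exactly the diagonal entries (up to ordering)
eigvals = np.linalg.eigvals(A)
assert np.allclose(np.sort(eigvals.real), np.sort(np.diag(A)))

# determinant is the product of the diagonal entries
assert np.isclose(np.linalg.det(A), 2.0 * 3.0 * 7.0)
```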
Gram Matrix
We see a lot of $A^T A$ or $A A^T$, which is commonly known as the Gram matrix.
- Always PSD (super useful for convexity) [proof: $x^T A^T A x = \|A x\|^2 \geq 0$]
- No guarantees on invertibility (it could have low rank)
- Symmetric, which means that there exists an eigenbasis, etc.
- If $A$ is orthogonal, then $A^T A$ is the identity.
Interestingly, if $X$ represents a matrix of mean-centered vector samples (one sample per row), $\frac{1}{n} X^T X$ is the empirical covariance matrix.
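These properties can be verified numerically; a sketch assuming random Gaussian data, checking the covariance claim against numpy's own `np.cov`:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(5, 3))

G = A.T @ A                        # Gram matrix
assert np.allclose(G, G.T)         # symmetric
assert np.min(np.linalg.eigvalsh(G)) >= -1e-10   # PSD: eigenvalues >= 0

# empirical covariance of mean-centered samples (rows = samples)
X = rng.normal(size=(100, 3))
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / X.shape[0]
assert np.allclose(cov, np.cov(X, rowvar=False, bias=True))
```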
If $A$ has full column rank, then $A^T A$ is invertible. Why?
- Let $A^T A x = 0$.
- From our four fundamental subspaces, we note that the null space of $A^T$ is the orthogonal complement of the range of $A$. Therefore, no non-zero element of the range of $A$ can be in the null space of $A^T$, so we know that $A x = 0$.
- Now, because $A$ is injective (full column rank), we conclude that $x = 0$. Therefore, we conclude that $A^T A$ is injective, and because $A^T A$ is a square operator, we conclude that it is invertible.
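The claim is easy to test empirically; a sketch contrasting a full-column-rank matrix with a rank-deficient one (sizes and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(6, 3))        # tall Gaussian matrix: full column rank
assert np.linalg.matrix_rank(A) == 3

G = A.T @ A                        # 3x3 Gram matrix, invertible here
G_inv = np.linalg.inv(G)
assert np.allclose(G @ G_inv, np.eye(3))

# a rank-deficient A gives a singular (non-invertible) Gram matrix
B = np.ones((6, 3))                # rank 1
assert np.linalg.matrix_rank(B.T @ B) == 1
```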
This idea will be important when talking about left and right inverses in a second.
You can also conclude that $A^T A$ is symmetric, and this is a result of the dot-product interpretation of matrix multiplication: $(A^T A)_{ij} = a_i \cdot a_j = (A^T A)_{ji}$, where $a_i$ is the $i$-th column of $A$. This also means that $A^T A$ has an orthonormal basis of eigenvectors, which means that it can be expressed as $Q \Lambda Q^T$.
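The $Q \Lambda Q^T$ factorization can be computed with `np.linalg.eigh`, which is meant for symmetric matrices; a sketch on a random Gram matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(4, 4))
G = A.T @ A                        # symmetric, so eigh applies

lam, Q = np.linalg.eigh(G)         # columns of Q are orthonormal eigenvectors
assert np.allclose(Q.T @ Q, np.eye(4))

# reconstruct G = Q diag(lam) Q^T
assert np.allclose(Q @ np.diag(lam) @ Q.T, G)
```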
Outer Product
If we have an $n \times 1$ vector $v$ (standard), we can compute the matrix $v v^T$, which is an $n \times n$ matrix. Each element is just $(v v^T)_{ij} = v_i v_j$. This has rank $1$, because $v v^T = \|v\|^2 \, \hat{v} \hat{v}^T$ is an SVD with a single singular value.
In general, if we have an $n \times d$ matrix $A$ with rank $r$, the matrix $A A^T$ has rank $r$ for the same reasons as outlined above.
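A sketch verifying the entries, the rank, and the single singular value of an outer product (the vector is arbitrary):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
M = np.outer(v, v)                           # n x n outer product v v^T
assert M.shape == (3, 3)
assert np.isclose(M[0, 2], v[0] * v[2])      # each entry is v_i v_j
assert np.linalg.matrix_rank(M) == 1         # rank one

# the single nonzero singular value equals ||v||^2
s = np.linalg.svd(M, compute_uv=False)
assert np.isclose(s[0], np.dot(v, v))
assert np.allclose(s[1:], 0)

# a rank-r matrix A gives A A^T of the same rank r
A = np.vstack([v, 2 * v])                    # 2 x 3 matrix of rank 1
assert np.linalg.matrix_rank(A @ A.T) == np.linalg.matrix_rank(A) == 1
```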