Linear Algebra Background

💡
Some of this stuff is redundant with previous classes!

When do we use eigenvectors?

Definiteness

💡
Just because all of a matrix’s entries are positive doesn’t mean that it’s PSD

A matrix $A$ is positive definite if $x^T A x > 0$ for all $x \neq 0$, and positive semidefinite if $x^T A x \geq 0$ for all $x$.

Being PSD does not necessarily imply that a matrix is symmetric, although we often restrict our discussions to symmetric matrices.
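
A quick numerical sanity check (a minimal NumPy sketch; the helper name `is_psd` and the tolerance are my own, not from the notes). It tests semidefiniteness via the eigenvalues of the symmetric part, and shows that all-positive entries are not enough:

```python
# Minimal sketch: checking positive semidefiniteness numerically with NumPy.
import numpy as np

def is_psd(A, tol=1e-10):
    """Return True if the symmetric part of A has no negative eigenvalues."""
    A_sym = (A + A.T) / 2              # only the symmetric part matters for x^T A x
    eigvals = np.linalg.eigvalsh(A_sym)
    return bool(np.all(eigvals >= -tol))

# All-positive entries do NOT imply PSD: this matrix has eigenvalues 3 and -1.
B = np.array([[1.0, 2.0],
              [2.0, 1.0]])
print(is_psd(B))                       # False
print(is_psd(B.T @ B))                 # True: any Gram matrix A^T A is PSD
```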

The following are equivalent to being PSD (and therefore are necessary & sufficient conditions)

Other important properties for PSD (necessary, but not sufficient)

Inverses

Sufficient conditions (cheat sheet)

Here are some sufficient conditions

Pseudoinverse (Moore-Penrose)

The easiest way of understanding the pseudoinverse is that it minimizes $||Ax - b||_2^2$ by setting $x = A^+ b$.

But more concretely, the pseudoinverse does two things

  1. Projects the output $b$ onto the range of $A$
  1. Maps this projection back into the row space (the orthogonal complement of the null space) of $A$

This is not necessarily a true inverse: the input could have had a component in the null space of $A$, which is forever lost, and $b$ may lie outside the range of $A$, so no input can reach it exactly; that part is lost too.

We can compute the pseudoinverse from the SVD $A = U \Sigma V^T$ as $A^+ = V \Sigma^+ U^T$, where $\Sigma^+$ replaces each nonzero entry of $\Sigma$ with its reciprocal and leaves the zeros as zeros.
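
A minimal NumPy sketch of this recipe (the function name `pinv_via_svd` and the tolerance are my own). It builds $A^+$ from the SVD and checks it against NumPy’s built-in pseudoinverse and least-squares solver:

```python
# Sketch: Moore-Penrose pseudoinverse from the SVD, used to minimize ||Ax - b||_2^2.
import numpy as np

def pinv_via_svd(A, tol=1e-12):
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.where(s > tol, 1.0 / s, 0.0)   # reciprocal of nonzero singular values, else 0
    return Vt.T @ np.diag(s_inv) @ U.T

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
b = rng.standard_normal(5)

x = pinv_via_svd(A) @ b
print(np.allclose(x, np.linalg.pinv(A) @ b))                  # matches np.linalg.pinv
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))   # matches least squares
```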

Symmetric matrices

$A = A^T$ is a symmetric matrix. $A = -A^T$ is an antisymmetric matrix. All symmetric and antisymmetric matrices are square.

Symmetric

The defining condition is $A = A^T$.

If a matrix is symmetric, this means that it:

  1. Can be decomposed into $U \Sigma U^T$ where $U$ is the eigenbasis (spectral theorem)
  1. Has an orthogonal eigenbasis (spectral theorem)
  1. Can be factored into $V V^T$ with $V = U \Sigma^{1/2}$. If $\Sigma$ is non-negative, then $V$ is real (derived from the factorization)
  1. Inverse is also symmetric

Note that symmetry does not guarantee invertibility, as some of the eigenvalues may be $0$.
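
A small NumPy sketch of the first three properties above (the example matrix is made up and is PSD by construction, so that $V$ comes out real):

```python
# Sketch of the spectral theorem: A = U Sigma U^T with orthonormal U, and A = V V^T.
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T                          # symmetric (and PSD) by construction

w, U = np.linalg.eigh(A)             # eigh is the solver for symmetric/Hermitian matrices
print(np.allclose(U @ np.diag(w) @ U.T, A))   # A = U Sigma U^T
print(np.allclose(U.T @ U, np.eye(4)))        # the eigenbasis is orthonormal

V = U @ np.diag(np.sqrt(w))          # real because the eigenvalues are non-negative here
print(np.allclose(V @ V.T, A))                # A = V V^T
```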

Side note: if a matrix is not symmetric, it can still be decomposed through SVD.

Sums of symmetric matrices

$A + A^T$ is symmetric and $A - A^T$ is antisymmetric. Furthermore, because $A = \frac{1}{2}(A + A^T) + \frac{1}{2}(A - A^T)$, we can express any square matrix $A$ as the sum of a symmetric and an antisymmetric matrix!
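
A quick NumPy check of this decomposition (the example matrix is arbitrary):

```python
# Sketch: split a square matrix into symmetric and antisymmetric parts.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))

S = (A + A.T) / 2                    # symmetric part
K = (A - A.T) / 2                    # antisymmetric part
print(np.allclose(S, S.T), np.allclose(K, -K.T))   # True True
print(np.allclose(S + K, A))                        # the two parts recover A
```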

Numerical methods

Flop Count, BLAS

A FLOP is a floating point operation (a single addition, subtraction, multiplication, or division), and counting flops gives a rough estimate of an algorithm's cost. BLAS (Basic Linear Algebra Subprograms) is the standard low-level library interface for these vector and matrix operations.

Different linear algebra operations have different flop counts
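
For reference (standard leading-order counts for dense operations, added here as a cheat sheet): an inner product of two $n$-vectors costs about $2n$ flops, a matrix-vector product with an $m \times n$ matrix about $2mn$, a product of $m \times n$ and $n \times p$ matrices about $2mnp$, and solving a triangular $n \times n$ system about $n^2$.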

Systems of linear equations

If $A$ is invertible, then the solution to $Ax = b$ is just $A^{-1} b$. However, we almost never form the inverse explicitly because it costs $O(n^3)$ flops; there are cheaper ways of implicitly applying the inverse.

The inverse of a diagonal matrix is trivial to compute, and triangular systems can be solved through forward (or back) substitution.

Orthogonal matrices are easy to invert ($Q^{-1} = Q^T$), and permutation matrices are also trivial to invert because they are orthogonal. In fact, because a permutation matrix is just ones and zeros, we can simply read the entries of $b$ in a different order, so the cost of solving the system is 0 flops; no computation is needed.

If you can factor the matrix into simpler matrices, then you can invert each factor individually. Sometimes, if you do it correctly, the complexity drops. This is especially helpful if you want to solve $Ax = b$ for multiple $b$'s. A common factorization is $A = PLU$, where $P$ is a permutation, $L$ is lower-triangular, and $U$ is upper-triangular; this costs around $(2/3) n^3$ flops for large $n$. We can also use the Cholesky factorization $A = L L^T$ (about half the flops of LU) if $A$ is positive definite. Here, $L$ is lower triangular with positive diagonal entries.
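
A sketch of the factor-once, solve-many pattern using SciPy (the example matrices are made up):

```python
# Sketch: solve Ax = b by factoring once and reusing the factors for several b's.
import numpy as np
from scipy.linalg import lu_factor, lu_solve, cho_factor, cho_solve

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
b1, b2 = rng.standard_normal(4), rng.standard_normal(4)

lu, piv = lu_factor(A)               # ~(2/3) n^3 flops, paid once
x1 = lu_solve((lu, piv), b1)         # each subsequent solve is only ~n^2 flops
x2 = lu_solve((lu, piv), b2)
print(np.allclose(A @ x1, b1), np.allclose(A @ x2, b2))

P = A @ A.T + np.eye(4)              # positive definite by construction
c, low = cho_factor(P)               # Cholesky: about half the cost of LU
y = cho_solve((c, low), b1)
print(np.allclose(P @ y, b1))
```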

Block Elimination

You can express $Ax = b$ as a block equation

$$\begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} b_1 \\ b_2 \end{bmatrix}.$$

From the first block row, we can write $x_1$ as

$$x_1 = A_{11}^{-1} (b_1 - A_{12} x_2),$$

which, substituted into the second block row, allows us to write the following:

$$(A_{22} - A_{21} A_{11}^{-1} A_{12}) \, x_2 = b_2 - A_{21} A_{11}^{-1} b_1.$$

We call

$$S = A_{22} - A_{21} A_{11}^{-1} A_{12}$$

the Schur complement.
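
A small NumPy sketch of block elimination (the matrix and block sizes are made up; it assumes $A_{11}$ is invertible):

```python
# Sketch: solve a 2x2 block system by eliminating x1 and solving the Schur complement system for x2.
import numpy as np

rng = np.random.default_rng(0)
n1, n2 = 3, 2
A = rng.standard_normal((n1 + n2, n1 + n2)) + 5 * np.eye(n1 + n2)  # keep it well conditioned
b = rng.standard_normal(n1 + n2)

A11, A12 = A[:n1, :n1], A[:n1, n1:]
A21, A22 = A[n1:, :n1], A[n1:, n1:]
b1, b2 = b[:n1], b[n1:]

S = A22 - A21 @ np.linalg.solve(A11, A12)                      # Schur complement of A11
x2 = np.linalg.solve(S, b2 - A21 @ np.linalg.solve(A11, b1))   # solve the reduced system
x1 = np.linalg.solve(A11, b1 - A12 @ x2)                       # back-substitute for x1

print(np.allclose(np.concatenate([x1, x2]), np.linalg.solve(A, b)))
```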