Gauss-Newton Method + Regularized Least Squares

Multi-objective least squares

Sometimes, you want to find an $x$ that satisfies more than one objective; for example, we may want to minimize both $\|Ax - y\|^2$ and $\|Fx - g\|^2$.

Usually, these objectives are competing, and we want to satisfy them fairly.

The best trade-offs are called Pareto optimal points. We can derive one such solution through a weighted objective

$$J = \lambda_1 \|Ax - y\|^2 + \lambda_2 \|Fx - g\|^2$$

We can express this as an ordinary least-squares objective by stacking the weighted blocks on top of each other:

$$J = \left\| \begin{bmatrix} \sqrt{\lambda_1}\,A \\ \sqrt{\lambda_2}\,F \end{bmatrix} x - \begin{bmatrix} \sqrt{\lambda_1}\,y \\ \sqrt{\lambda_2}\,g \end{bmatrix} \right\|^2$$

which means the least-squares solution follows from the exact same approach as before:

$$\hat{x} = \left(\lambda_1 A^T A + \lambda_2 F^T F\right)^{-1} \left(\lambda_1 A^T y + \lambda_2 F^T g\right)$$
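As a minimal numpy sketch of the stacking trick (the matrices, vectors, and weights below are made-up placeholders):

```python
import numpy as np

# Toy data: A, F, y, g and the weights lam1, lam2 are illustrative assumptions.
rng = np.random.default_rng(0)
A, y = rng.normal(size=(20, 5)), rng.normal(size=20)
F, g = rng.normal(size=(10, 5)), rng.normal(size=10)
lam1, lam2 = 1.0, 0.5

# Stack the sqrt-weighted blocks so one ordinary least-squares solve applies.
A_stacked = np.vstack([np.sqrt(lam1) * A, np.sqrt(lam2) * F])
b_stacked = np.concatenate([np.sqrt(lam1) * y, np.sqrt(lam2) * g])
x_hat, *_ = np.linalg.lstsq(A_stacked, b_stacked, rcond=None)

# Sanity check: matches the normal-equations formula above.
x_ne = np.linalg.solve(lam1 * A.T @ A + lam2 * F.T @ F,
                       lam1 * A.T @ y + lam2 * F.T @ g)
assert np.allclose(x_hat, x_ne)
```

The square roots on the weights matter: squaring the stacked norm reproduces exactly $\lambda_1 \|Ax - y\|^2 + \lambda_2 \|Fx - g\|^2$.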

Regularized least squares

In regularized least squares, the second objective is to keep $\|x\|$ as small as possible, so we minimize $\|Ax - y\|^2 + \lambda \|x\|^2$. This weighted-sum objective has the closed-form minimizer

$$\hat{x} = \left(A^T A + \lambda I\right)^{-1} A^T y$$
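A minimal sketch of the closed form (the data and the choice of $\lambda$ are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
A, y = rng.normal(size=(20, 5)), rng.normal(size=20)  # placeholder data
lam = 0.1  # regularization weight; an illustrative choice

# Closed form: x_hat = (A^T A + lam * I)^{-1} A^T y
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)
```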

Gauss-Newton Method

Our current objective can be expressed more generally as

$$\min_x \; \|r(x)\|^2,$$

where $r(x) = Ax - y$. But what if the residual weren't linear?

In this setup, it's generally very hard to solve exactly, but we can compute locally optimal solutions and repeat until convergence. To do this, we linearize: compute the Jacobian of the residual at the current iterate $x^{(k)}$ and use the Jacobian as the $A$:

$$r(x) \approx r(x^{(k)}) + J(x^{(k)})\left(x - x^{(k)}\right)$$

which means that the setup for the next iterate is

$$x^{(k+1)} = \arg\min_x \left\| r(x^{(k)}) + J(x^{(k)})\left(x - x^{(k)}\right) \right\|^2$$

which you can solve at each iteration through ordinary least squares. The idea is that each step pushes you closer to a minimizer of the true objective. Unfortunately, convergence isn't guaranteed, though it often works in practice, especially when started near a solution.
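A minimal Gauss-Newton sketch, fitting $y \approx a\,e^{bt}$; the synthetic data, model, and starting point are made-up choices for illustration:

```python
import numpy as np

def gauss_newton(r, J, x0, iters=20):
    """Minimize ||r(x)||^2 by repeatedly linearizing r around the iterate."""
    x = x0
    for _ in range(iters):
        # Solve the linearized least-squares problem for the step s,
        # i.e. minimize ||r(x) + J(x) s||^2.
        s, *_ = np.linalg.lstsq(J(x), -r(x), rcond=None)
        x = x + s
    return x

# Synthetic data generated from a = 2.0, b = -1.5 (an assumed ground truth).
t = np.linspace(0.0, 1.0, 30)
y = 2.0 * np.exp(-1.5 * t)

def r(x):  # residual vector: model(x) - y
    a, b = x
    return a * np.exp(b * t) - y

def J(x):  # Jacobian of the residual: columns are dr/da and dr/db
    a, b = x
    return np.column_stack([np.exp(b * t), a * t * np.exp(b * t)])

print(gauss_newton(r, J, np.array([1.0, -1.0])))  # converges to ~[2.0, -1.5]
```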

You can also regularize in the Gauss-Newton method to make sure that the next iterate stays close to the current one:

$$x^{(k+1)} = \arg\min_x \left\| r(x^{(k)}) + J(x^{(k)})\left(x - x^{(k)}\right) \right\|^2 + \lambda \left\| x - x^{(k)} \right\|^2$$
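This damped update is the idea behind the Levenberg-Marquardt family of methods. A minimal sketch, assuming a fixed damping weight `lam` (practical implementations adapt it each iteration); `r` and `J` are residual and Jacobian functions as in the previous sketch:

```python
import numpy as np

def damped_gauss_newton(r, J, x0, lam=1.0, iters=50):
    """Gauss-Newton with a proximity penalty lam * ||x - x_k||^2."""
    x = x0
    for _ in range(iters):
        Jk, rk = J(x), r(x)
        # Minimize ||rk + Jk s||^2 + lam ||s||^2 over the step s = x - x_k.
        s = np.linalg.solve(Jk.T @ Jk + lam * np.eye(x.size), -Jk.T @ rk)
        x = x + s
    return x
```

Larger `lam` keeps each step smaller (safer but slower); `lam -> 0` recovers the plain Gauss-Newton step.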