Introduction

Tags: Basics

Optimization

Optimization means minimizing an objective function while following certain rules, expressed as inequality and equality constraint functions.
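
One standard way to write this down (the symbols f_0 for the objective, f_i for the inequality constraints, and h_j for the equality constraints are conventional notation added here, not from the notes):

```latex
\begin{aligned}
\text{minimize}   \quad & f_0(x) \\
\text{subject to} \quad & f_i(x) \le 0, \quad i = 1, \dots, m \\
                        & h_j(x) = 0,   \quad j = 1, \dots, p
\end{aligned}
```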

Types of optimization

Uses for optimization

Convex Optimization

Convex Optimization is a special form of the general optimization problem, where all equality constraints are linear and the objective and inequality constraint functions are convex, i.e. they have upward curvature. Now, this might sound easy, but we haven't said anything about how many constraints and variables there are. We might be dealing with thousands of variables and many, many constraints.
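
For instance (an added example, not from the notes above), constrained least squares fits this template: the objective is convex, x >= 0 is a set of linear inequalities, and Cx = d is a linear equality constraint.

```latex
\begin{aligned}
\text{minimize}   \quad & \lVert Ax - b \rVert_2^2 \\
\text{subject to} \quad & x \ge 0 \\
                        & Cx = d
\end{aligned}
```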

The cool part about convex optimization is that as soon as you can get a problem into convex form, you can pull the crank and get a solution. There is no art in finding this solution.
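
As a sketch of what pulling the crank can look like in practice, assuming the CVXPY library is available (CVXPY and the particular problem below are illustrative choices, not something the notes prescribe):

```python
import cvxpy as cp
import numpy as np

# Random data for a small constrained least-squares problem.
np.random.seed(0)
A = np.random.randn(30, 10)
b = np.random.randn(30)
C = np.random.randn(5, 10)
x_feasible = np.abs(np.random.randn(10))  # ensures the constraints can be satisfied
d = C @ x_feasible

# Declare the variable and the problem: convex objective,
# linear inequality constraints, linear equality constraints.
x = cp.Variable(10)
objective = cp.Minimize(cp.sum_squares(A @ x - b))
constraints = [x >= 0, C @ x == d]
prob = cp.Problem(objective, constraints)

# "Pull the crank": no initial guess, no tuning.
prob.solve()
print("optimal value:", prob.value)
print("optimal x:", x.value)
```

The modeling layer checks that the problem is convex, hands it off to a solver, and returns the optimum without any tuning on our part.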

Solving convex optimization problems

There are many different convex optimization approaches that we'll talk about. Interior-point and first-order methods are common: interior-point methods work out of the box without needing initial guesses or tuning, while first-order methods are lightweight and easy to implement.

Generally, there is no analytical solution, but there exist efficient iterative algorithms.
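
To make "efficient iterative algorithm" concrete, here is a minimal sketch of one first-order method, projected gradient descent, applied to nonnegative least squares; the problem, step size, and iteration count are assumptions for illustration, not something the notes specify.

```python
import numpy as np

def projected_gradient_descent(A, b, num_iters=500):
    """Minimize ||Ax - b||^2 subject to x >= 0 via projected gradient descent."""
    x = np.zeros(A.shape[1])
    # A safe fixed step size: 1 / L, where L = 2 * ||A||_2^2 is the
    # Lipschitz constant of the objective's gradient.
    L = 2 * np.linalg.norm(A, 2) ** 2
    step = 1.0 / L
    for _ in range(num_iters):
        grad = 2 * A.T @ (A @ x - b)          # gradient of ||Ax - b||^2
        x = np.maximum(x - step * grad, 0.0)  # gradient step, then project onto x >= 0
    return x

# Small random instance.
np.random.seed(0)
A = np.random.randn(30, 10)
b = np.random.randn(30)
x_hat = projected_gradient_descent(A, b)
print("objective value:", np.sum((A @ x_hat - b) ** 2))
```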