Channel: Jerry Zitao Liu

Local minima are global minima in convex optimization.


Probably everyone in, or close to, machine learning or data mining has heard the quote "There are no local minima for a convex problem." Today I would like to show the mathematical proof of this statement.

Before I start the proof, I would like to make two things clear:

  • What is convex optimization?
  • What notation will be used below?

Convex optimization problems are defined as follows:

$\min_{x} f(x)$ subject to $g_i(x) \le 0$, $i = 1, \dots, m$,

where $f$ and the $g_i$ are convex and $\mathcal{X}$ is the feasible set, i.e., $\mathcal{X} = \{x : g_i(x) \le 0, \ i = 1, \dots, m\}$.
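The key property of $\mathcal{X}$ used later is that it is convex: any convex combination of two feasible points is itself feasible. A minimal numerical sketch of that fact in Python, assuming a toy constraint $g(x) = \|x\|^2 - 1 \le 0$ (the unit disk; the function names here are my own, not from the original post):

```python
import random

def g(x):
    # Single constraint g(x) <= 0: the unit disk, a convex feasible set.
    return x[0] ** 2 + x[1] ** 2 - 1.0

def feasible(x):
    return g(x) <= 0.0

def combine(x, y, theta):
    # Convex combination theta*y + (1 - theta)*x of two points.
    return [theta * yi + (1.0 - theta) * xi for xi, yi in zip(x, y)]

def sample_feasible():
    # Rejection-sample a point from the feasible set.
    while True:
        p = [random.uniform(-1, 1), random.uniform(-1, 1)]
        if feasible(p):
            return p

random.seed(0)
for _ in range(1000):
    x, y = sample_feasible(), sample_feasible()
    theta = random.random()
    # Every convex combination of feasible points stays feasible.
    assert feasible(combine(x, y, theta))
print("all convex combinations feasible")
```

This only spot-checks one constraint, of course; the general statement follows from the convexity of each $g_i$.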

Now let's look at how to prove the statement. It turns out that the proof is pretty easy using proof by contradiction.

Theorem.

For convex optimization problems, every local minimum is a global minimum.

Proof.

Let $x^*$ be a local minimum with radius $\epsilon > 0$, i.e., $f(x^*) \le f(x)$ for every feasible $x$ with $\|x - x^*\| \le \epsilon$. Since we assume the statement is incorrect, there is some feasible $y$ such that $\|y - x^*\| > \epsilon$ and $f(y) < f(x^*)$.

Let's construct $z$ in the following way:

$z = \theta y + (1 - \theta) x^*$, where $\theta \in (0, 1)$.

First, $z \in \mathcal{X}$, due to the fact that both $x^*$ and $y$ are feasible points in $\mathcal{X}$ and $\mathcal{X}$ is a convex set.

Equivalently, $z$ satisfies all the constraints: $g_i(z) \le \theta g_i(y) + (1 - \theta) g_i(x^*) \le 0$. This is very easy to check once you remember that the $g_i$ are convex.

Up to now, we have a local minimum $x^*$ (with radius $\epsilon$), a point $y$ outside the $\epsilon$-neighborhood, and $z$, which lies on the line segment between $x^*$ and $y$. We can take $\theta$ small enough, e.g., $\theta = \epsilon / (2\|y - x^*\|)$, to bring $z$ into $x^*$'s $\epsilon$-neighborhood, which means $z$ is in $x^*$'s local area.

Since $f$ is a convex function, we have

$f(z) = f(\theta y + (1 - \theta) x^*) \le \theta f(y) + (1 - \theta) f(x^*) < \theta f(x^*) + (1 - \theta) f(x^*) = f(x^*).$

Here we see that $f(z)$ is less than $f(x^*)$, which contradicts the assumption that $x^*$ is a local minimum. This means the assumption is wrong, which concludes the proof. $\blacksquare$

