Thursday, October 27, 2011

Proof for the first-order condition of convexity

We want to prove the following statement:

A differentiable function f(x) is convex if and only if its domain is a convex set and the following inequality is satisfied for all x and y in the domain:

$$f(y) \ge f(x) + \nabla f(x)^T (y - x)$$

Intuitively, this condition says that the tangent, i.e. the first-order Taylor approximation of f at any point, is globally an under-estimator of f(x).
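As a quick illustration, here is a small numerical sanity check (a minimal sketch, not part of the proof) using the convex function f(x) = x²; its tangent line at any point should never exceed the function itself:

```python
import numpy as np

# Convex test function and its derivative
f = lambda x: x ** 2
df = lambda x: 2 * x

rng = np.random.default_rng(0)
xs = rng.uniform(-10, 10, size=1000)  # tangent points
ys = rng.uniform(-10, 10, size=1000)  # evaluation points

# First-order condition: f(y) >= f(x) + f'(x) * (y - x) for all x, y
tangent_values = f(xs) + df(xs) * (ys - xs)
assert np.all(f(ys) >= tangent_values - 1e-12), "convexity condition violated"
print("The tangent under-estimates f everywhere: check passed")
```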

Proof:

Before beginning the proof, let us first review the fundamental definition of a convex function:


  • Definition of a convex function: A function f(x) is said to be convex if and only if its domain is a convex set and if it satisfies the following inequality for all x, y in the domain and all t ∈ [0, 1]:

$$f(ty + (1-t)x) \le t f(y) + (1-t) f(x)$$

Let us assume that f(x) is convex. Then, according to the fundamental definition of convex functions, the following inequality must be satisfied for all x, y in the domain and all t ∈ [0, 1]:

$$f(ty + (1-t)x) \le t f(y) + (1-t) f(x) \tag{1}$$
Now, by a simple rearrangement of terms (subtract f(x) from both sides and divide by t, for t ∈ (0, 1]), we can rewrite Eq. (1) as follows:

$$\frac{f(x + t(y - x)) - f(x)}{t} \le f(y) - f(x) \tag{2}$$

Now, let

$$g(t) = f(x + t(y - x))$$
We now express Eq. (2) in terms of g(t), as shown below:

$$\frac{g(t) - g(0)}{t} \le f(y) - f(x) \tag{3}$$

where we have used the fact that g(0) = f(x).

Now, taking the limit as t → 0⁺ on both sides of Eq. (3), and recognizing the left-hand side as the definition of the derivative, we get:

$$g'(0) \le f(y) - f(x) \tag{4}$$

Now, we need to find g′(0). Instead, let us compute the more general g′(t), which follows from the chain rule:

$$g'(t) = \nabla f(x + t(y - x))^T (y - x)$$

Now, substituting t = 0 in the above equation, we get:

$$g'(0) = \nabla f(x)^T (y - x)$$

Now, substituting the above result into Eq. (4), we get:

$$\nabla f(x)^T (y - x) \le f(y) - f(x) \tag{5}$$

Notice that this is exactly the inequality condition we wanted to prove.

We have thus proved that:

"if the function f(x) is convex and differentiable, then the inequality condition shown in Eq. (5) must be satisfied."

However, our proof is not yet complete.

We now have to prove the "sufficiency" part of this condition, i.e. we want to prove that:

"if a function f(x) satisfies the inequality condition in Eq. (5), then it is sufficient to conclude that f(x) is a convex function."

Showing this part will complete our proof.

To begin with, let us assume that the inequality condition shown in Eq. (5) is satisfied.

Now, consider the point

$$z = ty + (1 - t)x, \quad t \in [0, 1]$$

Notice that, since Domain(f) is a convex set, z must belong to Domain(f). Also, since we assumed that the inequality condition in Eq. (5) is satisfied, the following two inequalities must hold true:

$$f(y) \ge f(z) + \nabla f(z)^T (y - z) \tag{6}$$

$$f(x) \ge f(z) + \nabla f(z)^T (x - z) \tag{7}$$

Now, multiplying the inequalities in Eqs. (6) and (7) by t and (1 − t) respectively and adding the results, we get:

$$t f(y) + (1 - t) f(x) \ge f(z) + \nabla f(z)^T \big( t(y - z) + (1 - t)(x - z) \big)$$

But t(y − z) + (1 − t)(x − z) = ty + (1 − t)x − z = 0, so the gradient term vanishes and we are left with:

$$f(ty + (1 - t)x) \le t f(y) + (1 - t) f(x)$$

Observe that this is exactly the inequality that f(x) must satisfy in order to be considered a convex function. Hence, we have proved that if f(x) satisfies the inequality condition in Eq. (5), then it is sufficient to conclude that f(x) is a convex function.

This ends our proof of the first-order condition of convexity.

Friday, October 21, 2011

Why is the error function minimized in logistic regression convex?

We want to prove that the error/objective function of logistic regression,

$$J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \Big[ y^{(i)} \log h_\theta(x^{(i)}) + \big(1 - y^{(i)}\big) \log\big(1 - h_\theta(x^{(i)})\big) \Big], \qquad h_\theta(x) = \frac{1}{1 + e^{-\theta^T x}},$$

is convex.

Proof:

Before beginning the proof, let us first review a few definitions and results related to convex functions:


  • Definition of a convex function: A function f(x) is said to be convex if the following inequality holds true for all x, y in its (convex) domain and all t ∈ [0, 1]:

$$f(ty + (1-t)x) \le t f(y) + (1-t) f(x)$$

  • First-order condition of convexity: A differentiable function f(x) is convex if the following inequality holds true for all x, y in its domain:

$$f(y) \ge f(x) + \nabla f(x)^T (y - x)$$

    Intuitively, this condition says that the first-order Taylor approximation (tangent) of f(x) is globally an under-estimator of f(x).

  • Second-order condition of convexity: A twice-differentiable function f(x) is convex if and only if its Hessian matrix (the matrix of second-order partial derivatives) is positive semi-definite, i.e.

$$\nabla^2 f(x) \succeq 0 \quad \text{for all } x \text{ in the domain.}$$
  • A non-negative linear combination of two or more convex functions is also convex: Let f(x) and g(x) be two convex functions. Then any linear combination with non-negative coefficients,

$$h(x) = a f(x) + b g(x), \qquad a, b \ge 0,$$

    is also a convex function (this can be easily proved using the definition of a convex function; see the sketch right after this list). Note that the coefficients must be non-negative: an arbitrary linear combination of convex functions need not be convex.
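For completeness, here is the one-line proof from the definition (the non-negativity of a and b is what preserves the direction of the inequalities):

$$h(ty + (1-t)x) = a f(ty + (1-t)x) + b g(ty + (1-t)x) \le a\big[t f(y) + (1-t) f(x)\big] + b\big[t g(y) + (1-t) g(x)\big] = t\, h(y) + (1-t)\, h(x).$$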

Now notice that if we can prove that the two functions

$$f_1(\theta) = -\log h_\theta(x) \qquad \text{and} \qquad f_2(\theta) = -\log\big(1 - h_\theta(x)\big)$$

are convex, then our objective function

$$J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \Big[ y^{(i)} \log h_\theta(x^{(i)}) + \big(1 - y^{(i)}\big) \log\big(1 - h_\theta(x^{(i)})\big) \Big]$$

must also be convex, since it is a linear combination of such functions with non-negative coefficients (each y^(i) is either 0 or 1, and 1/m > 0).

Let us now try to prove that

$$f_1(\theta) = -\log h_\theta(x) = \log\big(1 + e^{-\theta^T x}\big)$$

is a convex function of θ. In order to do this, we will use the second-order condition of convexity described above. Let us first compute the Hessian matrix:

$$\nabla_\theta f_1 = -\big(1 - h_\theta(x)\big)\, x \qquad\Longrightarrow\qquad \nabla^2_\theta f_1 = h_\theta(x)\big(1 - h_\theta(x)\big)\, x x^T$$

Now, here is the proof that this Hessian matrix is positive semi-definite: for any vector v,

$$v^T \Big[ h_\theta(x)\big(1 - h_\theta(x)\big)\, x x^T \Big] v = h_\theta(x)\big(1 - h_\theta(x)\big)\, (x^T v)^2 \ge 0,$$

since 0 < h_θ(x) < 1.
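As a numerical sanity check (a minimal sketch with made-up values, not part of the original derivation), we can compare this closed-form Hessian against a finite-difference approximation and confirm that its eigenvalues are non-negative:

```python
import numpy as np

def f1(theta, x):
    """f1(theta) = log(1 + exp(-theta^T x)) = -log h_theta(x)."""
    return np.logaddexp(0.0, -theta @ x)

rng = np.random.default_rng(0)
x = rng.normal(size=3)
theta = rng.normal(size=3)

# Closed-form Hessian: h(1-h) * x x^T
h = 1.0 / (1.0 + np.exp(-theta @ x))
H = h * (1.0 - h) * np.outer(x, x)

# Finite-difference Hessian for comparison
eps = 1e-4
H_fd = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        e_i, e_j = np.eye(3)[i], np.eye(3)[j]
        H_fd[i, j] = (f1(theta + eps*e_i + eps*e_j, x) - f1(theta + eps*e_i, x)
                      - f1(theta + eps*e_j, x) + f1(theta, x)) / eps**2

assert np.allclose(H, H_fd, atol=1e-3)
assert np.all(np.linalg.eigvalsh(H) >= -1e-12)  # positive semi-definite
print("Hessian matches finite differences and is PSD")
```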

Let us now try to prove that

$$f_2(\theta) = -\log\big(1 - h_\theta(x)\big) = \log\big(1 + e^{\theta^T x}\big)$$

is a convex function of θ. In order to do this, we will again use the second-order condition of convexity described above. Let us first compute its Hessian matrix:

$$\nabla_\theta f_2 = h_\theta(x)\, x \qquad\Longrightarrow\qquad \nabla^2_\theta f_2 = h_\theta(x)\big(1 - h_\theta(x)\big)\, x x^T$$

This is exactly the same Hessian as before, which we have already proved to be positive semi-definite.

Above, we have proved that both

$$f_1(\theta) = -\log h_\theta(x) \qquad \text{and} \qquad f_2(\theta) = -\log\big(1 - h_\theta(x)\big)$$

are convex functions. And the error/objective function of logistic regression,

$$J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \Big[ y^{(i)} \log h_\theta(x^{(i)}) + \big(1 - y^{(i)}\big) \log\big(1 - h_\theta(x^{(i)})\big) \Big],$$

is essentially a linear combination, with non-negative coefficients, of several such convex functions. Now, since such a combination of convex functions is convex, we conclude that the objective function of logistic regression is convex.

Hence proved …
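As a final numerical sanity check (a minimal sketch with synthetic data, not part of the proof), we can verify midpoint convexity of the loss along random segments in θ-space:

```python
import numpy as np

def logistic_loss(theta, X, y):
    """J(theta) = -(1/m) * sum[y*log(h) + (1-y)*log(1-h)], h = sigmoid(X @ theta)."""
    z = X @ theta
    log_h = -np.logaddexp(0.0, -z)   # log h       = -log(1 + exp(-z))
    log_1mh = -z + log_h             # log (1 - h) = -z - log(1 + exp(-z))
    return -np.mean(y * log_h + (1.0 - y) * log_1mh)

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 4))
y = (rng.random(50) < 0.5).astype(float)

# For a convex J: J((a + b)/2) <= (J(a) + J(b))/2 on every segment
for _ in range(100):
    a, b = rng.normal(size=4), rng.normal(size=4)
    lhs = logistic_loss((a + b) / 2, X, y)
    rhs = 0.5 * (logistic_loss(a, X, y) + logistic_loss(b, X, y))
    assert lhs <= rhs + 1e-12
print("Midpoint convexity holds on all sampled segments")
```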

Following the same line of argument, it can easily be proven that the objective function of logistic regression remains convex even when regularization is used.
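For instance, assuming the common L2 penalty with weight λ ≥ 0 (the exact scaling constant is just a convention), the regularized objective adds a convex quadratic term:

$$J_{\text{reg}}(\theta) = J(\theta) + \frac{\lambda}{2m} \|\theta\|^2, \qquad \nabla^2_\theta \left[ \frac{\lambda}{2m} \|\theta\|^2 \right] = \frac{\lambda}{m} I \succeq 0,$$

so J_reg(θ) is a sum of two convex functions and is therefore convex.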

Wednesday, July 20, 2011

Fermat's Theorem on Stationary Points - Why should the derivative vanish at Local/Global Extrema?

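In outline, the standard argument (a brief sketch: suppose f has a local maximum at c and is differentiable there, so f(c + h) ≤ f(c) for all sufficiently small h):

$$\lim_{h \to 0^{+}} \frac{f(c+h) - f(c)}{h} \le 0 \qquad \text{and} \qquad \lim_{h \to 0^{-}} \frac{f(c+h) - f(c)}{h} \ge 0$$

Both one-sided limits must equal f′(c), which forces f′(c) = 0. The local-minimum case is symmetric (apply the same argument to −f).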

Proofs of Area and Circumference of a Circle

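In outline, one standard pair of arguments (a brief sketch, with r the radius): the circumference is the limit of the perimeters of inscribed regular n-gons, and the area follows by summing thin concentric rings of radius t, thickness dt, and length 2πt:

$$C = \lim_{n \to \infty} n \cdot 2r \sin\!\big(\tfrac{\pi}{n}\big) = 2\pi r, \qquad A = \int_0^r 2\pi t \, dt = \pi r^2$$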

Saturday, June 25, 2011

Principal Component Analysis (PCA) Demystified

 

Dimensionality reduction using PCA is typically taught using a cooking recipe such as the following:

  • Standardize the data (subtract the mean of each feature, and optionally divide by its standard deviation).
  • Compute the covariance matrix of the standardized data.
  • Compute the eigenvalues and eigenvectors of the covariance matrix.
  • Sort the eigenvectors by decreasing eigenvalue and keep the top k of them.
  • Project the data onto these k eigenvectors to obtain the reduced representation.

Have you ever wondered where this recipe originated from? The underlying derivation is sketched below:

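In outline, the recipe falls out of the following variance-maximization argument (a compact sketch; Σ denotes the covariance matrix of the centered data). To find the direction w of maximum variance, solve:

$$\max_{w}\; w^T \Sigma w \quad \text{subject to} \quad w^T w = 1$$

Forming the Lagrangian and setting its gradient to zero,

$$\mathcal{L}(w, \lambda) = w^T \Sigma w - \lambda \big(w^T w - 1\big), \qquad \nabla_w \mathcal{L} = 2\Sigma w - 2\lambda w = 0 \;\;\Longrightarrow\;\; \Sigma w = \lambda w,$$

so every stationary direction w is an eigenvector of Σ, and the variance captured along it is wᵀΣw = λ, its eigenvalue. Choosing the k eigenvectors with the largest eigenvalues (each maximization repeated subject to orthogonality with the directions already chosen) therefore retains the most variance, which is exactly what the recipe's sort-and-project steps do.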

Why does the integral of the normal distribution equal 1? Where does the mysterious normalizing constant come from?

 

The case of the Univariate Normal Distribution

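In outline, the classical polar-coordinates trick (a sketch of the standard derivation for the zero-mean case; μ ≠ 0 follows by translating x). The one-dimensional integral has no elementary antiderivative, but its square does, once rewritten in polar coordinates (dx dy = r dr dθ):

$$I = \int_{-\infty}^{\infty} e^{-x^2 / 2\sigma^2}\, dx \;\;\Longrightarrow\;\; I^2 = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} e^{-(x^2 + y^2)/2\sigma^2}\, dx\, dy = \int_0^{2\pi}\!\!\int_0^{\infty} e^{-r^2/2\sigma^2}\, r\, dr\, d\theta = 2\pi\sigma^2$$

So I = σ√(2π), and dividing the density by σ√(2π) makes it integrate to exactly 1; this is where the normalizing constant comes from.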

The case of the Multivariate Normal Distribution

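In outline (a sketch of the standard change-of-variables argument, assuming Σ is positive definite so that Σ^(1/2) exists and is invertible): substituting y = Σ^(−1/2)(x − μ), with dx = |Σ|^(1/2) dy, reduces the problem to a product of univariate Gaussian integrals:

$$\int_{\mathbb{R}^n} e^{-\frac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)}\, dx = |\Sigma|^{1/2} \int_{\mathbb{R}^n} e^{-\frac{1}{2} y^T y}\, dy = |\Sigma|^{1/2} \prod_{i=1}^{n} \int_{-\infty}^{\infty} e^{-y_i^2/2}\, dy_i = (2\pi)^{n/2}\, |\Sigma|^{1/2}$$

Each one-dimensional factor equals √(2π) by the univariate result above, so dividing by (2π)^(n/2) |Σ|^(1/2) yields the familiar multivariate normalizing constant.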