
Friday, October 21, 2011

Why is the error function minimized in logistic regression convex?

We want to prove that the error/objective function of logistic regression,

$$J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[\, y^{(i)}\log h_\theta(x^{(i)}) + \left(1-y^{(i)}\right)\log\left(1-h_\theta(x^{(i)})\right)\right], \qquad h_\theta(x) = \frac{1}{1+e^{-\theta^T x}},$$

is convex.
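As a reference point, here is a minimal NumPy sketch of this objective (the function names `sigmoid` and `logistic_loss` are my own; only the formula itself comes from the post):

```python
import numpy as np

def sigmoid(z):
    """The logistic hypothesis h_theta(x), applied to z = theta^T x."""
    return 1.0 / (1.0 + np.exp(-z))

def logistic_loss(theta, X, y):
    """J(theta) = -(1/m) * sum_i [ y_i*log(h_i) + (1-y_i)*log(1-h_i) ],
    where h_i = sigmoid(theta^T x_i) and X stacks the x_i as rows."""
    m = len(y)
    h = sigmoid(X @ theta)
    return -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
```

For example, at `theta = 0` every prediction is `h = 0.5`, so the loss equals `log 2 ≈ 0.693` regardless of the labels.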

Proof:

Before beginning the proof, I would first like to review a few definitions and results related to convex functions:


  • Definition of a convex function: A function f(x) is said to be convex if, for all x, y in its domain and every $\lambda \in [0, 1]$, the following inequality holds true:

$$f\left(\lambda x + (1-\lambda)\,y\right) \;\le\; \lambda f(x) + (1-\lambda)\, f(y)$$

  • First-order condition of convexity: A differentiable function f(x) is convex if and only if the following inequality holds for all x and y:

$$f(y) \;\ge\; f(x) + \nabla f(x)^T (y - x)$$

Intuitively, this condition says that the tangent (the first-order Taylor approximation of f at x) is a global under-estimator of f.

  • Second-order condition of convexity: A twice-differentiable function f(x) is convex if and only if its Hessian matrix (the matrix of second-order partial derivatives) is positive semi-definite, i.e.

$$\nabla^2 f(x) \succeq 0, \quad \text{that is,} \quad z^T \nabla^2 f(x)\, z \ge 0 \ \text{ for all } z.$$
  • A non-negative linear combination of two or more convex functions is also convex: Let f(x) and g(x) be two convex functions. Then any combination

$$h(x) = a\, f(x) + b\, g(x), \qquad a \ge 0,\ b \ge 0,$$

    is also a convex function (this can be proved easily using the definition of a convex function). Note that the coefficients must be non-negative: an arbitrary linear combination such as f(x) − g(x) need not be convex.
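These rules can be sanity-checked numerically. The sketch below (illustrative only; the example functions x² and eˣ are my choices, not from the post) tests the convexity definition on random points, both for the individual functions and for a random non-negative combination of them:

```python
import numpy as np

# Two example convex functions (chosen for illustration).
f = lambda t: t ** 2
g = lambda t: np.exp(t)

rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.uniform(-5, 5, size=2)
    lam = rng.uniform(0, 1)
    a, b = rng.uniform(0, 3, size=2)          # non-negative coefficients
    h = lambda t: a * f(t) + b * g(t)         # non-negative combination
    # The defining inequality must hold for f, g, and their combination h.
    for fn in (f, g, h):
        lhs = fn(lam * x + (1 - lam) * y)
        rhs = lam * fn(x) + (1 - lam) * fn(y)
        assert lhs <= rhs + 1e-9
```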

Now notice that if we can prove that the two functions

$$f_1(\theta) = -\log h_\theta(x) \qquad \text{and} \qquad f_2(\theta) = -\log\left(1 - h_\theta(x)\right)$$

are convex, then our objective function

$$J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\left[\, y^{(i)}\left(-\log h_\theta(x^{(i)})\right) + \left(1-y^{(i)}\right)\left(-\log\left(1-h_\theta(x^{(i)})\right)\right)\right]$$

must also be convex, since it is a non-negative linear combination of such convex functions: each $y^{(i)}$ and $1-y^{(i)}$ is either 0 or 1, and the factor $1/m$ is positive.

Let us now try to prove that

$$f_1(\theta) = -\log h_\theta(x), \qquad h_\theta(x) = \frac{1}{1+e^{-\theta^T x}},$$

is a convex function of $\theta$. In order to do this, we will use the second-order condition of convexity described above. Writing $h = h_\theta(x)$ for brevity and using $\nabla_\theta h = h(1-h)\,x$, the gradient is

$$\nabla_\theta f_1 = -\frac{1}{h}\,\nabla_\theta h = -(1-h)\,x,$$

and the Hessian matrix is

$$\nabla^2_\theta f_1 = h(1-h)\, x x^T.$$

Now below is the proof that this Hessian matrix is positive semi-definite: for any vector $z$,

$$z^T \left[\, h(1-h)\, x x^T \,\right] z = h(1-h)\left(x^T z\right)^2 \ge 0,$$

since $0 < h < 1$ implies $h(1-h) > 0$. Hence $f_1$ is convex.
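As a numerical sanity check (a sketch with names of my own choosing, not part of the original argument), we can compare the analytic Hessian $h(1-h)\,x x^T$ of $-\log h_\theta(x)$ against a finite-difference estimate, and confirm its eigenvalues are non-negative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def f1(theta, x):
    """f1(theta) = -log h_theta(x)."""
    return -np.log(sigmoid(x @ theta))

rng = np.random.default_rng(1)
x = rng.normal(size=4)
theta = rng.normal(size=4)

# Analytic Hessian: h(1-h) * x x^T
h = sigmoid(x @ theta)
H = h * (1 - h) * np.outer(x, x)

# Finite-difference Hessian for comparison.
eps = 1e-5
H_fd = np.zeros((4, 4))
I = np.eye(4)
for i in range(4):
    for j in range(4):
        H_fd[i, j] = (f1(theta + eps * (I[i] + I[j]), x)
                      - f1(theta + eps * I[i], x)
                      - f1(theta + eps * I[j], x)
                      + f1(theta, x)) / eps ** 2

assert np.allclose(H, H_fd, atol=1e-3)
# PSD: all eigenvalues of the symmetric Hessian are (numerically) >= 0.
assert np.all(np.linalg.eigvalsh(H) >= -1e-12)
```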

Let us now try to prove that

$$f_2(\theta) = -\log\left(1 - h_\theta(x)\right)$$

is a convex function of $\theta$. In order to do this, we will again use the second-order condition of convexity described above. Using $\nabla_\theta h = h(1-h)\,x$ as before, the gradient is

$$\nabla_\theta f_2 = \frac{1}{1-h}\,\nabla_\theta h = h\,x,$$

and the Hessian matrix is

$$\nabla^2_\theta f_2 = h(1-h)\, x x^T,$$

which is exactly the same positive semi-definite matrix as before, so $f_2$ is convex as well.
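The same kind of numerical check works for this second term (again a sketch; the names and test values are mine). Here the analytic gradient $h\,x$ of $-\log(1-h_\theta(x))$ is compared against central finite differences, and the Hessian $h(1-h)\,x x^T$ is verified to be PSD:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def f2(theta, x):
    """f2(theta) = -log(1 - h_theta(x))."""
    return -np.log(1.0 - sigmoid(x @ theta))

rng = np.random.default_rng(3)
x = rng.normal(size=4)
theta = rng.normal(size=4)

# Analytic gradient: h * x
h = sigmoid(x @ theta)
grad = h * x

# Central finite-difference gradient for comparison.
eps = 1e-6
grad_fd = np.array([
    (f2(theta + eps * e, x) - f2(theta - eps * e, x)) / (2 * eps)
    for e in np.eye(4)
])
assert np.allclose(grad, grad_fd, atol=1e-6)

# The Hessian h(1-h) x x^T satisfies z^T H z = h(1-h) (x^T z)^2 >= 0.
H = h * (1 - h) * np.outer(x, x)
assert np.all(np.linalg.eigvalsh(H) >= -1e-12)
```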

Above, we have proved that both $-\log h_\theta(x)$ and $-\log\left(1-h_\theta(x)\right)$ are convex functions. And the error/objective function of logistic regression, $J(\theta)$, is essentially a non-negative linear combination of several such convex functions. Since a non-negative linear combination of convex functions is convex, we conclude that the objective function of logistic regression is convex.

Hence proved …

Following the same line of argument, it can easily be proven that the objective function of logistic regression remains convex when regularization is used: the L2 penalty $\frac{\lambda}{2m}\lVert\theta\rVert^2$ is itself convex (its Hessian is $\frac{\lambda}{m} I$, which is positive semi-definite for $\lambda \ge 0$), so the regularized objective is again a non-negative sum of convex functions.
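Putting the pieces together, the Hessian of the full objective is $\frac{1}{m} X^T D X$ with $D = \mathrm{diag}\big(h_i(1-h_i)\big)$ — note it does not depend on the labels, since each per-example term contributes the same $h(1-h)\,x x^T$ whether $y^{(i)}$ is 0 or 1. The sketch below (synthetic data invented for illustration) checks that this Hessian is PSD, and that adding an L2 penalty makes it strictly positive definite:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Small synthetic dataset (invented for illustration).
rng = np.random.default_rng(4)
m, n = 50, 3
X = rng.normal(size=(m, n))
theta = rng.normal(size=n)

# Hessian of the unregularized objective: (1/m) X^T D X,
# D = diag(h_i (1 - h_i)).  (X.T * w) @ X computes sum_i w_i x_i x_i^T.
h = sigmoid(X @ theta)
H = (X.T * (h * (1 - h))) @ X / m
assert np.all(np.linalg.eigvalsh(H) >= -1e-12)   # positive semi-definite

# An L2 penalty (lambda/(2m)) * ||theta||^2 adds (lambda/m) * I to the
# Hessian, making it strictly positive definite for lambda > 0.
lam = 0.1
H_reg = H + (lam / m) * np.eye(n)
assert np.all(np.linalg.eigvalsh(H_reg) > 0)
```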