What is boosting? Additive Modelling/Iterative Fitting

Boosting is a way of fitting an additive expansion in a set of elementary basis functions, such as simple polynomials. Assume for example that we have a function

f_M(x) = \sum_{m=1}^M \beta_m b(x;\gamma_m),

where the \beta_m are the expansion parameters to be determined in a minimization process and the b(x;\gamma_m) are simple functions of the multivariate input x , each characterized by its parameters \gamma_m .
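
To make the notation concrete, here is a minimal sketch in Python of such an additive expansion, assuming (purely for illustration) monomial basis functions b(x;\gamma_m)=x^{\gamma_m} ; the coefficients and degrees below are arbitrary choices, not fitted values.

import numpy as np

# Evaluate f_M(x) = sum_m beta_m * b(x; gamma_m) for the illustrative choice
# b(x; gamma) = x**gamma (monomial basis functions).
def additive_expansion(x, betas, gammas):
    return sum(beta * x**gamma for beta, gamma in zip(betas, gammas))

x = np.linspace(0.0, 1.0, 5)
betas = [1.0, -0.5, 0.25]   # expansion parameters beta_m (arbitrary here)
gammas = [0, 1, 2]          # basis-function parameters gamma_m (monomial degrees)
print(additive_expansion(x, betas, gammas))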

As an example, consider the Sigmoid function we used in logistic regression. In that case, the basis function b(x;\gamma_m) is the Sigmoid function

\sigma(t) = \frac{1}{1+\exp{(-t)}},

where t=\gamma_0+\gamma_1 x and the parameters \gamma_0 and \gamma_1 were determined by the logistic regression fitting algorithm.
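
As a small sketch, this basis function can be evaluated as follows; the values of \gamma_0 and \gamma_1 below are hypothetical, standing in for the output of a logistic regression fit.

import numpy as np

def sigmoid(t):
    # sigma(t) = 1 / (1 + exp(-t))
    return 1.0 / (1.0 + np.exp(-t))

gamma0, gamma1 = -1.0, 2.0           # hypothetical fitted parameters
x = np.linspace(-3.0, 3.0, 7)
print(sigmoid(gamma0 + gamma1 * x))  # b(x; gamma) = sigma(gamma_0 + gamma_1 x)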

As another example, consider the cost function we defined for linear regression

C(\boldsymbol{y},\boldsymbol{f}) = \frac{1}{n} \sum_{i=0}^{n-1}(y_i-f(x_i))^2.

In this case the function f(x) is replaced by the product of the design matrix \boldsymbol{X} and the unknown linear regression parameters \boldsymbol{\beta} , that is \boldsymbol{f}=\boldsymbol{X}\boldsymbol{\beta} . In linear regression we can simply invert a matrix and obtain the parameters \boldsymbol{\beta} by

\boldsymbol{\beta}=\left(\boldsymbol{X}^T\boldsymbol{X}\right)^{-1}\boldsymbol{X}^T\boldsymbol{y}.
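As a quick sketch on synthetic data, this closed-form solution can be computed directly with NumPy; the pseudoinverse is used here instead of an explicit inverse for numerical stability, and the data-generating coefficients are chosen only for illustration.

import numpy as np

rng = np.random.default_rng(2024)
n = 100
x = rng.uniform(0.0, 1.0, n)
y = 2.0 + 3.0 * x + 0.1 * rng.standard_normal(n)   # synthetic data with known coefficients

X = np.column_stack([np.ones(n), x])               # design matrix with an intercept column
beta = np.linalg.pinv(X.T @ X) @ X.T @ y           # beta = (X^T X)^{-1} X^T y
print(beta)                                        # should be close to [2.0, 3.0]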

In iterative fitting or additive modeling, we minimize the cost function with respect to the parameters \beta_m and \gamma_m .
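
The sketch below illustrates such an iterative, forward stagewise fit under squared-error loss, assuming very simple step-function basis functions b(x;\gamma) that equal one when x>\gamma and zero otherwise; the grid of candidate \gamma values and the number of terms M are arbitrary choices made for illustration.

import numpy as np

rng = np.random.default_rng(42)
n = 200
x = rng.uniform(0.0, 1.0, n)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(n)

M = 20                               # number of expansion terms to add
f = np.zeros(n)                      # current additive model f_{m-1}(x)
candidates = np.linspace(0.0, 1.0, 50)   # grid of candidate gamma values
for m in range(M):
    residual = y - f
    best = None
    for gamma in candidates:
        b = (x > gamma).astype(float)            # basis function b(x; gamma)
        denom = b @ b
        if denom == 0:
            continue
        beta = (b @ residual) / denom            # least-squares beta_m for this gamma
        loss = np.sum((residual - beta * b)**2)
        if best is None or loss < best[0]:
            best = (loss, beta, gamma)
    _, beta_m, gamma_m = best
    f += beta_m * (x > gamma_m).astype(float)    # add the new term to the expansion

print("final mean squared error:", np.mean((y - f)**2))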