Basic Steps of AdaBoost

With the above definitions we are now ready to set up the algorithm for AdaBoost. The basic idea is to introduce weights which, at each iteration, increase the emphasis on the misclassified cases relative to the correctly classified ones.

  1. We start by initializing all weights to $w_i = 1/n$, for $i=0,1,2,\dots,n-1$. It is easy to see that we must have $\sum_{i=0}^{n-1} w_i = 1$.
  2. We rewrite the misclassification error as
$$\overline{\mathrm{err}}_m = \frac{\sum_{i=0}^{n-1} w_i^m I(y_i \ne G(x_i))}{\sum_{i=0}^{n-1} w_i},$$
  3. Then we start looping over all attempts at classifying, namely we start an iterative process for $m=1:M$, where $M$ is the final number of classifications. Our given classifier could for example be a plain decision tree.
    1. Fit then a given classifier to the training set using the weights $w_i$.
    2. Compute then $\overline{\mathrm{err}}_m$ and figure out which events are classified properly and which are classified wrongly.
    3. Define a quantity $\alpha_m = \log{\left((1-\overline{\mathrm{err}}_m)/\overline{\mathrm{err}}_m\right)}$.
    4. Set the new weights to $w_i \leftarrow w_i \times \exp{\left(\alpha_m I(y_i \ne G(x_i))\right)}$.
  4. Compute the new classifier $G(x) = \mathrm{sign}\left(\sum_{m=1}^{M} \alpha_m G_m(x)\right)$.
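The steps above can be sketched in a short, self-contained Python implementation. As a stand-in weak classifier $G_m$ we use a one-level decision stump; the stump representation and all helper names below are our own choices, not part of the original algorithm description:

```python
import numpy as np

def stump_fit(X, y, w):
    """Fit a decision stump (our stand-in weak classifier G_m) by
    minimizing the weighted misclassification error over all
    feature/threshold/polarity combinations."""
    n, d = X.shape
    best, best_err = None, np.inf
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] < thr, -sign, sign)
                err = np.sum(w * (pred != y)) / np.sum(w)
                if err < best_err:
                    best_err, best = err, (j, thr, sign)
    return best, best_err

def stump_predict(stump, X):
    j, thr, sign = stump
    return np.where(X[:, j] < thr, -sign, sign)

def adaboost(X, y, M=10):
    """Basic AdaBoost loop following the steps above; labels y must be +-1."""
    n = len(y)
    w = np.full(n, 1.0 / n)                 # step 1: uniform initial weights
    stumps, alphas = [], []
    for m in range(M):                      # step 3: iterate m = 1, ..., M
        stump, err = stump_fit(X, y, w)     # 3.1: fit using the weights w_i
        err = max(err, 1e-10)               # guard against log(0) below
        alpha = np.log((1.0 - err) / err)   # 3.3: alpha_m
        pred = stump_predict(stump, X)
        w = w * np.exp(alpha * (pred != y)) # 3.4: scale up the misses
        w /= w.sum()                        # renormalize so the weights sum to 1
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def predict(stumps, alphas, X):
    """Step 4: sign of the alpha-weighted vote over all weak classifiers."""
    votes = sum(a * stump_predict(s, X) for s, a in zip(stumps, alphas))
    return np.sign(votes)

# Usage on a toy 1D dataset that a single stump can already separate:
X = np.array([[0.], [1.], [2.], [3.], [4.], [5.]])
y = np.array([-1, -1, -1, 1, 1, 1])
stumps, alphas = adaboost(X, y, M=5)
print(predict(stumps, alphas, X))  # matches y
```

The brute-force threshold search in `stump_fit` is only meant for small illustrative datasets; in practice one would use an off-the-shelf tree learner that accepts sample weights.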

For the iterations with $m \ge 2$ the weights are modified individually at each step. The observations which were misclassified at iteration $m-1$ are given a larger weight than those which were classified properly. As this proceeds, the observations which are difficult to classify correctly are given a larger influence. Each new classification step $m$ is then forced to concentrate on those observations that were missed in the previous iterations.
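A single weight update makes this concrete. The four-observation setup below is a hypothetical example of our own; only one observation (index 3) is assumed misclassified:

```python
import numpy as np

# Hypothetical toy setup: 4 observations with uniform weights,
# of which only the one at index 3 is misclassified.
w = np.full(4, 0.25)
miss = np.array([False, False, False, True])

err = np.sum(w * miss) / np.sum(w)   # weighted error: 0.25
alpha = np.log((1 - err) / err)      # alpha = log 3
w = w * np.exp(alpha * miss)         # only the miss is scaled up (by 3)
w /= w.sum()                         # renormalize

print(w)  # approximately [1/6, 1/6, 1/6, 1/2]
```

After a single update the misclassified observation carries half of the total weight, so the next classifier is strongly pushed to get it right.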