
Gentle AdaBoost for CARTs

The Gentle AdaBoost algorithm is a variant of the powerful boosting learning technique [4]. It is used here to select a set of simple CARTs that together achieve a given detection rate and error rate. In the following, a correct detection is referred to as a hit and an error as a false alarm. The various AdaBoost algorithms differ in their weight-update schemes. According to Lienhart et al., Gentle AdaBoost is the most successful of the learning procedures tested for face detection applications [8,9].

The learning is based on $ N$ weighted training examples $ (x_1,y_1), \ldots, (x_N,y_N)$, where the $ x_i$ are the images and $ y_i \in \{-1,1\}$, $ i \in \{1,\ldots,N\}$, are the corresponding class labels. At the beginning of the learning phase, the weights are initialized as $ w_i = 1/N$. The following three steps are repeated to select simple CARTs until a given detection rate $ d$ is reached:

  1. Every simple classifier, i.e., a CART, is fitted to the data, and its error $ e$ is calculated with respect to the weights $ w_i$.
  2. The CART $ h_t$ with the lowest weighted error is chosen for the classification function, and the counter $ t$ is incremented.
  3. The weights are updated with $ w_i := w_i \cdot e^{-y_i h_t(x_i)}$ and renormalized so that they sum to one (a sketch of the complete loop follows this list).
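To make the loop concrete, the following is a minimal sketch in Python rather than the implementation used here. It assumes scikit-learn's DecisionTreeRegressor as a stand-in for the simple CARTs, a fixed number of rounds T in place of the detection-rate criterion $ d$, and a single weighted tree fit in place of explicitly fitting every candidate CART and selecting the best one (steps 1 and 2); the function names are illustrative.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gentle_adaboost(X, y, T=50, max_depth=2):
    """X: (N, d) array of feature vectors; y: array of labels in {-1, +1}."""
    N = len(y)
    w = np.full(N, 1.0 / N)               # initial weights w_i = 1/N
    carts = []
    for _ in range(T):
        # Steps 1-2: fit a simple CART to the weighted data. Gentle
        # AdaBoost fits h_t by weighted least squares, so h_t(x) is a
        # real-valued confidence rather than a hard {-1, +1} label.
        h_t = DecisionTreeRegressor(max_depth=max_depth)
        h_t.fit(X, y, sample_weight=w)
        carts.append(h_t)
        # Step 3: w_i := w_i * exp(-y_i h_t(x_i)), then renormalize;
        # examples with y_i h_t(x_i) < 0 (misclassified) gain weight.
        w *= np.exp(-y * h_t.predict(X))
        w /= w.sum()
    return carts

def classify(carts, X):
    # Final strong classifier: the sign of the summed CART responses.
    return np.sign(sum(h_t.predict(X) for h_t in carts))

Because the weight update multiplies by $ e^{-y_i h_t(x_i)}$, examples that the current CART classifies badly receive larger weights, so subsequent CARTs concentrate on the hard cases.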

The final output of the strong classifier is $ \mathrm{sign}\left(\sum_{t=1}^{T} h_t(x)\right)$, where $ h_t(x)$ is the weighted return value of the $ t$-th CART. Next, a cascade based on these classifiers is built.

