
Learning Classification Functions


The Gentle Ada Boost algorithm is a variant of the powerful boosting learning technique [Freund_1996]. It is used to select a set of simple features that achieves a given detection and error rate. In the following, a detection is referred to as a hit and an error as a false alarm. The various Ada Boost algorithms differ in the update scheme of the weights. According to Lienhart et al., the Gentle Ada Boost algorithm is the most successful learning procedure tested for face detection applications [Lienhart_2003_1].

The learning is based on $N$ weighted training examples $(x_1,y_1), \ldots, (x_N,y_N)$, where the $x_i$ are the training images and $y_i \in \{-1,1\}$, $i \in \{1,\ldots,N\}$, are the corresponding class labels. At the beginning of the learning phase the weights $w_i$ are initialized with $w_i = 1/N$. The following three steps are repeated to select simple features until a given detection rate $d$ is reached (a sketch of this training loop is given after the list):

  1. Every simple classifier, i.e., a classifier based on a single feature, is fit to the weighted data, and its error $e$ is calculated with respect to the weights $w_i$.
  2. The simple classifier with the lowest error is chosen as $h_t$ and added to the classification function; the counter $t$ is increased.
  3. The weights are updated with $w_i := w_i \cdot e^{-y_i h_t(x_i)}$ and renormalized.
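
The loop above can be made concrete with a minimal sketch. The following Python code (assuming NumPy is available) uses regression stumps as the simple classifiers, fitting $\alpha$ and $\beta$ as weighted means of the labels on the two sides of a threshold, and for simplicity runs a fixed number of boosting rounds rather than stopping at a target detection rate $d$; the names fit_stump and gentle_adaboost are illustrative and not taken from the original text.

    import numpy as np

    def fit_stump(feature_values, y, w):
        """Fit one simple classifier (a regression stump) to the weighted data.

        For every candidate threshold, alpha and beta are the weighted means
        of the labels on the two sides of the threshold; the stump with the
        smallest weighted squared error is returned as (error, thr, alpha, beta).
        """
        best = None
        for thr in np.unique(feature_values):
            right = feature_values >= thr
            left = ~right
            alpha = np.average(y[right], weights=w[right]) if right.any() else 0.0
            beta = np.average(y[left], weights=w[left]) if left.any() else 0.0
            error = np.sum(w * (y - np.where(right, alpha, beta)) ** 2)
            if best is None or error < best[0]:
                best = (error, thr, alpha, beta)
        return best

    def gentle_adaboost(X, y, num_rounds):
        """Select num_rounds simple features from the columns of X (one column
        per feature) for N examples with labels y in {-1, +1}."""
        N, num_features = X.shape
        w = np.full(N, 1.0 / N)          # initialization: w_i = 1/N
        stumps = []
        for _ in range(num_rounds):
            # Step 1: fit every simple classifier and compute its weighted error.
            fits = [fit_stump(X[:, f], y, w) for f in range(num_features)]
            # Step 2: choose the feature whose classifier has the lowest error.
            f_best = int(np.argmin([fit[0] for fit in fits]))
            _, thr, alpha, beta = fits[f_best]
            stumps.append((f_best, thr, alpha, beta))
            # Step 3: update the weights with w_i := w_i * exp(-y_i h_t(x_i))
            # and renormalize.
            h = np.where(X[:, f_best] >= thr, alpha, beta)
            w *= np.exp(-y * h)
            w /= w.sum()
        return stumps

In the face detection setting, each column of $X$ would hold the values of one Haar-like feature computed from the integral image, as described in the previous section.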

The final output of the strong classifier is $\mathrm{sign}\left(\sum_{t=1}^{T} h_t(x)\right)$, where $h_t(x) = \alpha$ if the value of the feature belonging to $h_t$ is greater than or equal to its threshold $\mathrm{thr}$, and $h_t(x) = \beta$ otherwise. $\alpha$ and $\beta$ are the outputs of the fitted simple feature classifiers; they depend on the assigned weights, the expected error, and the classifier size. Next, a cascade based on these classifiers is built.
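
The cascade described next is built from such strong classifiers. As a minimal illustration, and assuming the stump representation (feature index, threshold, $\alpha$, $\beta$) produced by the training sketch above, the following hypothetical function evaluates the final decision $\mathrm{sign}\left(\sum_{t=1}^{T} h_t(x)\right)$ for one example:

    def classify(stumps, x):
        """Evaluate the strong classifier sign(sum_t h_t(x)) for one example x,
        given as the vector of its simple feature values.  Each stump outputs
        alpha if its feature value is at or above the threshold and beta
        otherwise."""
        total = 0.0
        for feature_index, thr, alpha, beta in stumps:
            total += alpha if x[feature_index] >= thr else beta
        return 1 if total >= 0.0 else -1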


