The performance of a single classifier is not sufficient for object
classification: although it achieves a high hit rate, e.g., 0.999,
its error rate, i.e., the rate of false positives, is also high,
e.g., 0.5. Nevertheless, the hit rate is much higher than the error
rate. To construct an overall good classifier, several classifiers
are arranged in a cascade, i.e., a degenerate decision tree. In every
stage of the cascade a decision is made whether the image contains
the object or not; only images accepted by one stage are passed on to
the next. This arrangement reduces both rates. Since the hit rates of
the stages are close to one, their product is also close to one,
while the product of the much smaller error rates approaches zero.
Furthermore, the whole classification process speeds up, because most
negative examples are already rejected in the early stages.
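To make the effect of this multiplication concrete, assume a cascade
of $n = 10$ stages (the stage count is a hypothetical choice for
illustration), each with hit rate $d_i = 0.999$ and error rate
$f_i = 0.5$ as above. The overall rates are the products

$D = \prod_{i=1}^{n} d_i = 0.999^{10} \approx 0.990$ and
$F = \prod_{i=1}^{n} f_i = 0.5^{10} \approx 0.001$,

so the combined hit rate stays close to one, while the combined error
rate drops to roughly one false positive in a thousand.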
Figure 3 shows an example cascade of classifiers for
detecting an ``office chair'' in depth images.
Figure 3: The first three stages of a cascade of classifiers to
detect an office chair in depth data. Every stage contains several
simple classifiers that use Haar-like features.
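A minimal sketch of how such a cascade is evaluated at detection time
is given below. The stage representation (a list of weighted
Haar-like feature classifiers plus a stage threshold) and all names
are assumptions for illustration, not the authors' implementation:

from dataclasses import dataclass
from typing import Callable, List, Tuple


@dataclass
class Stage:
    """One cascade stage: several simple classifiers plus a threshold.

    Each simple classifier is a tuple (feature, theta, polarity,
    alpha); feature maps an image window to a Haar-like feature value.
    """
    classifiers: List[Tuple[Callable[[object], float], float, int, float]]
    threshold: float


def evaluate_stage(stage: Stage, window) -> bool:
    """Sum the weighted votes of the stage's simple classifiers."""
    score = 0.0
    for feature, theta, polarity, alpha in stage.classifiers:
        # A simple classifier votes with weight alpha if the feature
        # value lies on the correct side of its threshold theta.
        if polarity * feature(window) < polarity * theta:
            score += alpha
    return score >= stage.threshold


def classify(cascade: List[Stage], window) -> bool:
    """Pass a window through the stages; reject at the first failure."""
    for stage in cascade:
        if not evaluate_stage(stage, window):
            return False  # early rejection keeps the cascade fast
    return True  # accepted by every stage: reported as the object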
An overall effective cascade is learned by a simple iterative
method. For every stage, a classification function is learned by
adding features until the required hit rate is reached. Training then
continues with the next stage, using as negative examples only those
that the cascade so far misclassifies. The number of features used in
each classifier therefore increases with additional stages
(Figure 3).
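This iterative method can be sketched as follows, reusing classify
from the sketch above. Here learn_classification_function is a
placeholder for the boosting step of the previous section, and its
signature and the default parameters are assumptions for
illustration:

def train_cascade(positives, negatives, target_hit_rate=0.999,
                  max_stages=20):
    """Learn cascade stages iteratively, bootstrapping the negatives.

    learn_classification_function is assumed to add simple
    classifiers to a new stage until the stage reaches
    target_hit_rate on the positive examples.
    """
    cascade = []
    while negatives and len(cascade) < max_stages:
        stage = learn_classification_function(
            positives, negatives, target_hit_rate)
        cascade.append(stage)
        # Keep only the negatives that the cascade so far still
        # misclassifies (false positives); they become the training
        # set for the next stage, which is why later stages need more
        # features to separate these harder examples.
        negatives = [n for n in negatives if classify(cascade, n)]
    return cascade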