Title: AdaTree: Boosting a Weak Classifier into a Decision Tree
Abstract: We present a boosting method that results in a decision tree rather than a fixed linear sequence of classifiers. An equally correct statement is that we present a tree-growing method whose performance can be analysed in the framework of Adaboost. We argue that Adaboost can be improved by presenting the input to a sequence of weak classifiers, each one tuned to the conditional probability determined by the output of previous weak classifiers. As a result, the final classifier has a tree structure, rather than being linear, thus the name "AdaTree". One of the consequences of the tree structure is that different input data may have different processing time. Early experimentation shows a reduced computation cost with respect to Adaboost. One of our intended applications is real-time detection, where cascades of boosted detectors have recently become successful. The reduced computation cost of the proposed method shows some potential for being used directly in detection problems, without need of a cascade.
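As a rough illustration of the central idea, the sketch below grows a tree of decision stumps on 1-D data, where each node's weak classifier is trained only on the examples routed to it by earlier outputs, i.e. on the conditional distribution those outputs induce. This is not the paper's algorithm: all function names are ours, and we use uniform sample weights and unit votes instead of AdaBoost's weighted training and alpha coefficients.

```python
def train_stump(data):
    """Fit a threshold stump to (x, y) pairs, y in {-1, +1}.

    Uniform weights are an assumed simplification; AdaBoost-style
    training would minimise a weighted error instead.
    """
    best = None
    for t in sorted({x for x, _ in data}):
        for sign in (+1, -1):
            err = sum(1 for x, y in data
                      if sign * (1 if x >= t else -1) != y)
            if best is None or err < best[0]:
                best = (err, t, sign)
    return best[1], best[2]  # (threshold, sign)

def stump_predict(stump, x):
    t, sign = stump
    return sign * (1 if x >= t else -1)

def grow_adatree(data, depth):
    # Each child stump is trained only on the subset routed to it,
    # so deeper weak classifiers see the conditional distribution
    # determined by the outputs of the earlier ones.
    if depth == 0 or not data:
        return None
    stump = train_stump(data)
    neg = [d for d in data if stump_predict(stump, d[0]) == -1]
    pos = [d for d in data if stump_predict(stump, d[0]) == +1]
    return {"stump": stump,
            "neg": grow_adatree(neg, depth - 1),
            "pos": grow_adatree(pos, depth - 1)}

def adatree_predict(node, x):
    # Walk one root-to-leaf path, accumulating votes; different
    # inputs take different paths, hence variable processing time.
    score = 0
    while node is not None:
        s = stump_predict(node["stump"], x)
        score += s
        node = node["pos"] if s == +1 else node["neg"]
    return 1 if score >= 0 else -1
```

On a non-monotone labelling such as y = +1 exactly when 2 <= x <= 4, a single stump must err, while a depth-3 tree of stumps separates the classes, since each node only needs to be weakly accurate on its own conditional subset.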
Publication Year: 2005
Publication Date: 2005-04-01
Language: en
Type: article
Indexed In: ['crossref']
Cited By Count: 32