Consistency for L2Boosting and Matching Pursuit with Trees and Tree-type Basis Functions

Peter Bühlmann

October 2002

Abstract

We present new consistency results in regression and classification for L2Boosting, a powerful variant of boosting with the squared error loss function. For any dimension of the predictor, a square-integrable regression function or an arbitrary conditional probability function, both potentially discontinuous, can be consistently estimated with L2Boosting using tree-type learners. We also discuss close connections to matching pursuit with basis functions in signal processing and demonstrate differences between tree and rectangle indicator basis functions. Depending on the signal-to-noise ratio, one of them outperforms the other, which gives additional flexibility to tune boosting to high- or low-noise problems.
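The abstract describes L2Boosting as boosting with the squared error loss and a tree-type base learner, i.e. repeatedly fitting a small tree to the current residuals and adding it to the fit. The following is a minimal sketch of that idea, not the report's own procedure: it assumes scikit-learn's DecisionTreeRegressor (a stump by default) as the tree learner, and the names l2_boost, nu (shrinkage factor) and M (number of boosting iterations) are illustrative choices, not taken from the paper.

```python
# Minimal L2Boosting sketch with tree stumps as base learners.
# Assumptions: sklearn's DecisionTreeRegressor as the tree learner;
# `nu` and `M` are user-chosen tuning constants (illustrative names).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def l2_boost(X, y, M=100, nu=0.1, max_depth=1):
    """Build an additive fit F(x) = nu * sum_m tree_m(x) by repeatedly
    fitting a small regression tree to the current residuals."""
    F = np.zeros(len(y))           # current fit, initialized at zero
    trees = []
    for _ in range(M):
        r = y - F                  # residuals under the squared error loss
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, r)             # base learner fitted to the residuals
        F += nu * tree.predict(X)  # small step in the fitted direction
        trees.append(tree)
    def predict(X_new):
        return nu * sum(t.predict(X_new) for t in trees)
    return predict

# Usage: noisy regression with a discontinuous target function,
# the kind of setting the abstract says L2Boosting handles consistently.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 1))
y = (X[:, 0] > 0.5).astype(float) + 0.1 * rng.normal(size=200)
f_hat = l2_boost(X, y, M=200, nu=0.1)
```

For 0/1 class labels the same procedure, applied to the labels directly, yields an estimate of the conditional probability function, which is the classification setting referred to in the abstract.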

Keywords: basis selection, Bayes risk consistency, boosting, multiple decision trees, nonparametric classification, nonparametric regression


