2006 UnbiasedRecursivePartitioningAC

From GM-RKB

Subject Headings: Regression Tree Algorithm, ctree System.

Notes

Cited By

Quotes

Abstract

Recursive binary partitioning is a popular tool for regression analysis. Two fundamental problems of exhaustive search procedures usually applied to fit such models have been known for a long time: overfitting and a selection bias towards covariates with many possible splits or missing values. While pruning procedures are able to solve the overfitting problem, the variable selection bias still seriously affects the interpretability of tree-structured regression models. For some special cases unbiased procedures have been suggested, however lacking a common theoretical foundation. We propose a unified framework for recursive partitioning which embeds tree-structured regression models into a well-defined theory of conditional inference procedures. Stopping criteria based on multiple test procedures are implemented and it is shown that the predictive performance of the resulting trees is as good as the performance of established exhaustive search procedures. It turns out that the partitions and therefore the models induced by both approaches are structurally different, indicating the need for an unbiased variable selection. The methodology presented here is applicable to all kinds of regression problems, including nominal, ordinal, numeric, censored as well as multivariate response variables and arbitrary measurement scales of the covariates. Data from studies on animal abundance, glaucoma classification, node-positive breast cancer and mammography experience are re-analyzed.
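The key idea in the abstract is to separate variable selection from split-point search: each covariate's association with the response is scored by a conditional (permutation) test, p-values are adjusted for multiple testing, and splitting stops when no covariate is significant. The following is a minimal Python sketch of that selection step only, not the authors' ctree implementation; the covariance-based test statistic, the Bonferroni adjustment, and the helper names (`permutation_p_value`, `select_split_variable`) are simplified illustrations of the framework, not the paper's exact procedure.

```python
import random

def permutation_p_value(x, y, n_perm=999, seed=0):
    """Approximate the p-value for association between a numeric covariate x
    and a numeric response y by permuting y and comparing the absolute
    covariance of each permutation against the observed value."""
    rng = random.Random(seed)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n  # means are invariant under permutation

    def abs_cov(a, b):
        return abs(sum((ai - mx) * (bi - my) for ai, bi in zip(a, b)))

    observed = abs_cov(x, y)
    hits = 0
    y_perm = list(y)
    for _ in range(n_perm):
        rng.shuffle(y_perm)
        if abs_cov(x, y_perm) >= observed:
            hits += 1
    # add-one correction keeps the estimate strictly positive
    return (hits + 1) / (n_perm + 1)

def select_split_variable(covariates, y, alpha=0.05):
    """Unbiased variable selection: choose the covariate with the smallest
    Bonferroni-adjusted permutation p-value; return None (i.e. stop
    splitting) when no covariate is significant at level alpha."""
    m = len(covariates)
    p_values = [min(1.0, m * permutation_p_value(x, y)) for x in covariates]
    best = min(range(m), key=lambda j: p_values[j])
    return best if p_values[best] <= alpha else None
```

Because every candidate covariate is judged by a p-value on the same probability scale, a variable with many possible split points gains no advantage over one with few, which is the source of the selection bias this framework removes; the significance threshold then doubles as the pre-pruning stopping rule.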

References

(Hothorn et al., 2006) ⇒ Torsten Hothorn, Kurt Hornik, and Achim Zeileis. (2006). "Unbiased Recursive Partitioning: A Conditional Inference Framework." In: Journal of Computational and Graphical Statistics, 15(3). doi:10.1198/106186006X133933