CORT: Classification Or Regression Trees

dc.citation.bibtexName: inproceedings
dc.citation.conferenceName: IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP)
dc.contributor.author: Scott, Clayton
dc.contributor.author: Willett, Rebecca
dc.contributor.author: Nowak, Robert David
dc.contributor.org: Digital Signal Processing (http://dsp.rice.edu/)
dc.date.accessioned: 2007-10-31T01:04:46Z
dc.date.available: 2007-10-31T01:04:46Z
dc.date.issued: 2003-04-20
dc.date.modified: 2003-01-29
dc.date.note: 2003-01-29
dc.date.submitted: 2003-04-20
dc.description: Conference Paper
dc.description.abstract: In this paper we challenge three of the underlying principles of CART, a well-known approach to the construction of classification and regression trees. Our primary concern is with the penalization strategy employed to prune back an initial, overgrown tree. We reason, based on both intuitive and theoretical arguments, that the pruning rule for classification should be different from that used for regression (unlike CART). We also argue that growing a tree-structured partition specifically fitted to the data is unnecessary. Instead, our approach to tree modeling begins with a non-adapted (fixed) dyadic tree structure and partition, much like that underlying multiscale wavelet analysis. We show that dyadic trees provide sufficient flexibility, are easy to construct, and produce near-optimal results when properly pruned. Finally, we advocate the use of a negative log-likelihood measure of empirical risk. This is a more appropriate empirical risk for non-Gaussian regression problems, in contrast to the sum-of-squared errors criterion used in CART regression.
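To make the abstract's two central ideas concrete, here is a minimal, illustrative sketch — not the paper's implementation. It builds a fixed (non-adapted) dyadic partition of [0, 1), as the abstract describes, and contrasts the CART-style sum-of-squared-errors cell risk with a negative log-likelihood cell risk (a Poisson model is assumed here purely as an example of a non-Gaussian regression setting; the function names are hypothetical).

```python
import math

def dyadic_cells(depth):
    """Non-adapted dyadic partition of [0, 1): 2**depth equal cells,
    fixed in advance rather than grown from the data."""
    n = 2 ** depth
    return [(i / n, (i + 1) / n) for i in range(n)]

def sse_risk(values):
    """CART-style empirical risk for one cell:
    sum of squared errors about the cell mean."""
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values)

def poisson_nll_risk(counts):
    """Negative log-likelihood empirical risk for one cell under an
    assumed Poisson model with a constant rate per cell (the MLE rate
    is the cell mean); the log(k!) terms are dropped since they do
    not depend on the fitted rate."""
    rate = sum(counts) / len(counts)
    if rate == 0:
        return 0.0
    return sum(rate - k * math.log(rate) for k in counts)

if __name__ == "__main__":
    # Four fixed dyadic cells at depth 2, regardless of the data.
    print(dyadic_cells(2))
    # The two risks rank the same constant fit differently in general.
    counts = [2, 2, 5]
    print(sse_risk(counts), poisson_nll_risk(counts))
```

A pruning rule in this setting would compare each cell's risk plus a complexity penalty against the risk of merging sibling cells; the paper's point is that the right penalty differs between classification and regression.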
dc.description.sponsorship: Office of Naval Research
dc.description.sponsorship: Army Research Office
dc.description.sponsorship: National Science Foundation
dc.identifier.citation: C. Scott, R. Willett and R. D. Nowak, "CORT: Classification Or Regression Trees," 2003.
dc.identifier.doi: http://dx.doi.org/10.1109/ICASSP.2003.1201641
dc.identifier.uri: https://hdl.handle.net/1911/20343
dc.language.iso: eng
dc.subject: classification
dc.subject: multiscale
dc.subject: risk
dc.subject: CART
dc.subject.keyword: classification
dc.subject.keyword: multiscale
dc.subject.keyword: risk
dc.subject.keyword: CART
dc.subject.other: Multiscale Methods
dc.title: CORT: Classification Or Regression Trees
dc.type: Conference paper
dc.type.dcmi: Text
Files
Original bundle
Name: Sco2003Apr5CORTClass.PDF
Size: 399.39 KB
Format: Adobe Portable Document Format