Browsing by Author "Willett, Rebecca"
Item: CORT: Classification Or Regression Trees (2003-06-20)
Scott, Clayton; Willett, Rebecca; Nowak, Robert David; Digital Signal Processing (http://dsp.rice.edu/)
In this paper we challenge three of the underlying principles of CART, a well-known approach to the construction of classification and regression trees. Our primary concern is with the penalization strategy employed to prune back an initial, overgrown tree. We reason, based on both intuitive and theoretical arguments, that the pruning rule for classification should be different from that used for regression (unlike CART). We also argue that growing a tree-structured partition that is specifically fitted to the data is unnecessary. Instead, our approach to tree modeling begins with a nonadapted (fixed) dyadic tree structure and partition, much like that underlying multiscale wavelet analysis. We show that dyadic trees provide sufficient flexibility, are easy to construct, and produce near-optimal results when properly pruned. Finally, we advocate the use of a negative log-likelihood measure of empirical risk. This is a more appropriate empirical risk for non-Gaussian regression problems, in contrast to the sum-of-squared-errors criterion used in CART regression.

Item: Multiresolution Intensity Estimation of Piecewise Linear Poisson Processes (2001-04-20)
Willett, Rebecca; Digital Signal Processing (http://dsp.rice.edu/)
Given observations of a one-dimensional, piecewise linear, length-M Poisson intensity function, our goal is to estimate both the partition points and the parameters of each segment. In order to determine where the breaks lie, we develop a maximum penalized likelihood estimator (MPLE) based on information-theoretic complexity penalization. We construct a probabilistic model of the observations within a multiscale framework, and use this framework to devise a computationally efficient optimization algorithm, based on a tree-pruning approach, to compute the MPLE.
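As a rough sketch of the tree-pruning approach described in the item above, the following Python fragment computes a penalized-likelihood fit over a recursive dyadic partition. It fits a constant Poisson rate per segment rather than the paper's linear segments, and the penalty weight gamma, the function names, and the toy data are illustrative assumptions rather than the authors' code:

    import numpy as np

    def poisson_nll(counts):
        # Negative Poisson log-likelihood of one segment under its
        # maximum-likelihood constant rate (log-factorial terms omitted).
        rate = counts.mean()
        if rate == 0.0:
            return 0.0  # all-zero segment: the rate -> 0 limit of the NLL
        return float(np.sum(rate - counts * np.log(rate)))

    def prune(counts, gamma):
        # Returns (penalized cost, partition points) for the best pruned
        # recursive dyadic partition of this segment.
        leaf_cost = poisson_nll(counts) + gamma  # gamma charged per segment
        if len(counts) < 2:
            return leaf_cost, []
        mid = len(counts) // 2
        left_cost, left_breaks = prune(counts[:mid], gamma)
        right_cost, right_breaks = prune(counts[mid:], gamma)
        if left_cost + right_cost < leaf_cost:
            return (left_cost + right_cost,
                    left_breaks + [mid] + [mid + b for b in right_breaks])
        return leaf_cost, []

    # Toy example: the rate jumps from 2 to 10 halfway through 64 bins.
    rng = np.random.default_rng(0)
    counts = rng.poisson(np.r_[np.full(32, 2.0), np.full(32, 10.0)])
    cost, breaks = prune(counts, gamma=2.0)
    print(breaks)  # expect a breakpoint at or near bin 32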
Item: Multiresolution Nonparametric Intensity and Density Estimation (2002-05-20)
Willett, Rebecca; Nowak, Robert David; Digital Signal Processing (http://dsp.rice.edu/)
This paper introduces a new multiscale method for nonparametric piecewise polynomial intensity and density estimation of point processes. Fast, piecewise polynomial, maximum penalized likelihood methods for intensity and density estimation are developed. The recursive partitioning scheme underlying these methods is based on multiscale likelihood factorizations which, unlike conventional wavelet decompositions, are very well suited to applications with point process data. Experimental results demonstrate that multiscale methods can outperform wavelet- and kernel-based density estimation methods.

Item: Multiscale Analysis for Intensity and Density Estimation (2002-04-20)
Willett, Rebecca; Digital Signal Processing (http://dsp.rice.edu/)
The nonparametric multiscale polynomial and platelet algorithms presented in this thesis are powerful new tools for signal and image denoising and reconstruction. Unlike traditional wavelet-based multiscale methods, these algorithms are both well suited to processing Poisson and multinomial data and capable of preserving image edges. At the heart of these new algorithms lie multiscale signal decompositions based on polynomials in one dimension and multiscale image decompositions based on platelets in two dimensions. This thesis introduces platelets, localized atoms at various locations, scales, and orientations that can produce highly accurate, piecewise linear approximations to images consisting of smooth regions separated by smooth boundaries. Polynomial- and platelet-based maximum penalized likelihood methods for signal and image analysis are both tractable and computationally efficient. Simulations establish the practical effectiveness of these algorithms in applications such as medical and astronomical imaging, density estimation, and networking; statistical risk bounds establish the theoretical near-optimality of these algorithms.

Item: Multiscale Density Estimation (2003-08-20)
Willett, Rebecca; Nowak, Robert David; Digital Signal Processing (http://dsp.rice.edu/)
The nonparametric density estimation method proposed in this paper is computationally fast, capable of detecting density discontinuities and singularities at very high resolution, and spatially adaptive, and it offers near-minimax convergence rates for broad classes of densities, including Besov spaces. At the heart of this new method lie multiscale signal decompositions based on piecewise-polynomial functions and penalized likelihood estimation. Upper bounds on the estimation error are derived using an information-theoretic risk bound based on squared Hellinger loss. The method and theory share many of the desirable features associated with wavelet-based density estimators, but also offer several advantages, including guaranteed non-negativity, bounds on the L1 error, small-sample quantification of the estimation errors, and additional flexibility and adaptability. In particular, the method proposed here can adapt the degrees as well as the locations of the polynomial pieces. For a certain class of densities, the error of the variable-degree estimator converges at nearly the parametric rate. Experimental results demonstrate the advantages of the new approach compared to traditional density estimators and wavelet-based estimators.
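The multiscale likelihood factorization underlying these methods can be made concrete: the multinomial likelihood of binned counts factors into one binomial term per dyadic split, which is one reason point-process data fit this framework so naturally. Below is a minimal numerical check of that identity, assuming SciPy and a balanced dyadic tree; the function name and toy data are illustrative:

    import numpy as np
    from scipy.stats import binom, multinomial

    def split_loglik(counts, probs):
        # Multiscale factorization: each dyadic split contributes a binomial
        # log-likelihood for how the parent count divides between children.
        if len(counts) < 2:
            return 0.0
        mid = len(counts) // 2
        n_left = counts[:mid].sum()
        n_right = counts[mid:].sum()
        theta = probs[:mid].sum() / probs.sum()  # conditional split probability
        return (binom.logpmf(n_left, n_left + n_right, theta)
                + split_loglik(counts[:mid], probs[:mid])
                + split_loglik(counts[mid:], probs[mid:]))

    # Check the factorization numerically on random multinomial data.
    rng = np.random.default_rng(1)
    probs = rng.dirichlet(np.ones(8))
    counts = rng.multinomial(100, probs)
    full = multinomial.logpmf(counts, n=100, p=probs)
    print(np.isclose(full, split_loglik(counts, probs)))  # True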
Item: Multiscale Likelihood Analysis and Image Reconstruction (2003-08-20)
Willett, Rebecca; Nowak, Robert David; Digital Signal Processing (http://dsp.rice.edu/)
The nonparametric multiscale polynomial and platelet methods presented here are powerful new tools for signal and image denoising and reconstruction. Unlike traditional wavelet-based multiscale methods, these methods are both well suited to processing Poisson or multinomial data and capable of preserving image edges. At the heart of these new methods lie multiscale signal decompositions based on polynomials in one dimension and multiscale image decompositions based on what the authors call platelets in two dimensions. Platelets are localized functions at various positions, scales, and orientations that can produce highly accurate, piecewise linear approximations to images consisting of smooth regions separated by smooth boundaries. Polynomial- and platelet-based maximum penalized likelihood methods for signal and image analysis are both tractable and computationally efficient. Polynomial methods offer near-minimax convergence rates for broad classes of functions, including Besov spaces. Upper bounds on the estimation error are derived using an information-theoretic risk bound based on squared Hellinger loss. Simulations establish the practical effectiveness of these methods in applications such as density estimation, medical imaging, and astronomy.

Item: Platelets for Multiscale Analysis in Medical Imaging (2002-04-20)
Willett, Rebecca; Nowak, Robert David; Digital Signal Processing (http://dsp.rice.edu/)
This paper describes the development and use of multiscale, platelet-based image reconstruction algorithms in medical imaging. Such algorithms are effective because platelets approximate images in certain (piecewise) smoothness classes significantly more efficiently than sinusoids, wavelets, or wedgelets. Platelet representations are especially well suited to the analysis of Poisson data, unlike most other multiscale image representations, and they can be rapidly computed. We present a fast, platelet-based maximum penalized likelihood algorithm that encompasses denoising, deblurring, and tomographic reconstruction, and describe its applications to photon-limited imaging.

Item: Platelets for Multiscale Analysis in Photon-Limited Imaging (2002-09-20)
Willett, Rebecca; Nowak, Robert David; Digital Signal Processing (http://dsp.rice.edu/)
This paper proposes a new multiscale image decomposition based on platelets. Platelets are localized functions at various scales, locations, and orientations that produce piecewise linear image approximations. For smoothness measured in certain Hölder classes, the error of m-term platelet approximations can decay significantly faster than that of m-term approximations in terms of sinusoids, wavelets, or wedgelets. Platelet representations are especially well suited for the analysis of Poisson data, unlike most other multiscale image representations, and they can be rapidly computed. We propose a platelet-based maximum penalized likelihood criterion that encompasses denoising, deblurring, and tomographic reconstruction.
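At its simplest, a platelet atom is an affine fit over a localized region, which is why a single platelet can reproduce a linearly varying intensity that constant (Haar-style) fits would have to tile with many pieces. A minimal sketch of that basic fit follows; full platelets also include wedge-shaped splits, and the names here are illustrative:

    import numpy as np

    def platelet_fit(block):
        # Least-squares affine fit a + b*x + c*y to one image block: the
        # basic platelet atom (wedge-split atoms are omitted for brevity).
        ny, nx = block.shape
        y, x = np.mgrid[0:ny, 0:nx]
        A = np.column_stack([np.ones(block.size), x.ravel(), y.ravel()])
        coef, *_ = np.linalg.lstsq(A, block.ravel(), rcond=None)
        return (A @ coef).reshape(ny, nx)

    # A smooth intensity ramp is captured exactly by a single platelet.
    yy, xx = np.mgrid[0:16, 0:16]
    ramp = 3.0 + 0.5 * xx + 0.25 * yy
    print(np.abs(platelet_fit(ramp) - ramp).max())  # ~0 up to round-off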
Item: Platelets: A Multiscale Approach for Recovering Edges and Surfaces in Photon-Limited Medical Imaging (2003)
Willett, Rebecca; Nowak, Robert David; Digital Signal Processing (http://dsp.rice.edu/)
This paper proposes a new multiscale image decomposition based on platelets. Platelets are localized functions at various scales, locations, and orientations that produce piecewise linear image approximations. Platelets are well suited for approximating images consisting of smooth regions separated by smooth boundaries. For smoothness measured in certain Hölder classes, it is shown that the error of m-term platelet approximations can decay significantly faster than that of m-term approximations in terms of sinusoids, wavelets, or wedgelets. This suggests that platelets may outperform existing techniques for image denoising and reconstruction. Moreover, the platelet decomposition is based on a recursive image-partitioning scheme which, unlike conventional wavelet decompositions, is very well suited to photon-limited medical imaging applications involving Poisson-distributed data. Fast, platelet-based, maximum penalized likelihood methods for photon-limited image denoising, deblurring, and tomographic reconstruction problems are developed. Because platelet decompositions of Poisson-distributed images are tractable and computationally efficient, existing image reconstruction methods based on expectation-maximization-type algorithms can be easily enhanced with platelet techniques. Experimental results demonstrate that platelet-based methods can outperform standard reconstruction methods currently in use in confocal microscopy, image restoration, and emission tomography.
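Putting the pieces together, platelet reconstruction can be caricatured as pruning a quadtree: fit an affine surface to each cell, and keep a split only when the improvement pays for the extra penalty. The sketch below substitutes a squared-error cost for the Poisson likelihood the papers actually maximize, and the penalty value and all names are illustrative assumptions:

    import numpy as np

    def affine_fit(block):
        # Least-squares affine (platelet-style) fit; returns (fit, SSE).
        ny, nx = block.shape
        y, x = np.mgrid[0:ny, 0:nx]
        A = np.column_stack([np.ones(block.size), x.ravel(), y.ravel()])
        coef, *_ = np.linalg.lstsq(A, block.ravel(), rcond=None)
        fit = (A @ coef).reshape(ny, nx)
        return fit, float(np.sum((block - fit) ** 2))

    def prune_quadtree(img, gamma):
        # Keep a quadtree split only when the children's total penalized
        # cost beats fitting this cell with a single affine surface.
        fit, sse = affine_fit(img)
        leaf_cost = sse + gamma          # gamma charged per leaf kept
        ny, nx = img.shape
        if ny < 4 or nx < 4:             # stop at small cells
            return fit, leaf_cost
        hy, hx = ny // 2, nx // 2
        quads = [(s0, s1) for s0 in (slice(0, hy), slice(hy, ny))
                          for s1 in (slice(0, hx), slice(hx, nx))]
        child_fit, child_cost = np.empty_like(fit), 0.0
        for s0, s1 in quads:
            f, c = prune_quadtree(img[s0, s1], gamma)
            child_fit[s0, s1] = f
            child_cost += c
        if child_cost < leaf_cost:
            return child_fit, child_cost
        return fit, leaf_cost

    # Toy photon-limited image: a smooth ramp next to a flat bright region.
    rng = np.random.default_rng(2)
    yy, xx = np.mgrid[0:64, 0:64]
    intensity = np.where(xx < 32, 5.0 + 0.1 * yy, 20.0)
    counts = rng.poisson(intensity).astype(float)
    estimate, _ = prune_quadtree(counts, gamma=200.0)
    print(float(np.mean((estimate - intensity) ** 2)))  # error of the pruned fit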