Browsing by Author "Lavu, Sridhar"
Now showing 1 - 7 of 7
Item: 3D Geometry Coding using Mixture Models and the Estimation Quantization Algorithm (2002-09-01)
Lavu, Sridhar; Digital Signal Processing (http://dsp.rice.edu/)
3D surfaces are used in applications such as animations, 3D object modeling and visualization. The geometries of such surfaces are often approximated using polygonal meshes. This thesis aims to compress 3D geometry meshes using an algorithm based on normal meshes and the Estimation-Quantization (EQ) algorithm. Normal meshes are multilevel representations in which finer-level vertices lie along the normal to the local surface, which reduces the vertex data to one scalar value per vertex. A mixture distribution model is used for the wavelet coefficients. The EQ algorithm uses local neighborhood information and rate-distortion optimization to encode the wavelet coefficients. We achieve performance gains of 0.5 to 1 dB compared to the zerotree coder for normal meshes.

Item: Estimation-Quantization Geometry Coding Using Normal Meshes (2003-03-01)
Lavu, Sridhar; Choi, Hyeokho; Baraniuk, Richard G.; Digital Signal Processing (http://dsp.rice.edu/)
We propose a new algorithm for compressing three-dimensional triangular mesh data used for representing surfaces. We apply the Estimation-Quantization (EQ) algorithm, originally designed for still image compression, to the normal mesh wavelet coefficients. The EQ algorithm models the wavelet coefficients as a Gaussian random field with slowly varying standard deviation.
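The locally adaptive quantization idea behind EQ can be sketched as follows. This is an illustrative toy, not the authors' coder: the causal window size, the variance floor, and the step rule tied to the local standard deviation are all simplifying assumptions.

```python
import numpy as np

def eq_quantize(coeffs, base_step=0.5, eps=1e-6):
    """Toy Estimation-Quantization pass over a 1-D array of wavelet
    coefficients: each coefficient's quantizer step is scaled by the
    standard deviation estimated from already-coded (causal) neighbors,
    so the decoder can derive the same step without side information."""
    quantized = np.zeros_like(np.asarray(coeffs, dtype=float))
    for i in range(len(quantized)):
        # causal neighborhood: up to 4 previously coded coefficients
        window = quantized[max(0, i - 4):i]
        sigma = window.std() if window.size > 1 else 1.0
        step = base_step * max(sigma, eps)  # adapt step to local variance
        quantized[i] = np.round(coeffs[i] / step) * step
    return quantized
```

Because the standard-deviation estimate uses only previously quantized values, encoder and decoder stay in sync; a full coder would add the rate-distortion-optimal choice of `base_step` per coefficient class.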
By designing the quantizers in a rate-distortion-optimal fashion, we improve upon the recently proposed zerotree normal mesh compression algorithm by 0.5 to 1 dB in distortion.

Item: Geometry Compression of Normal Meshes using Rate-Distortion Algorithms (2003-06-01)
Lavu, Sridhar; Choi, Hyeokho; Baraniuk, Richard G.; Digital Signal Processing (http://dsp.rice.edu/)
We propose a new rate-distortion-based algorithm for compressing 3D surface geometry represented using triangular normal meshes. We apply the Estimation-Quantization (EQ) algorithm to compress normal mesh wavelet coefficients. The EQ algorithm models the wavelet coefficients as a Gaussian random field with slowly varying standard deviation that depends on the local neighborhood, and it uses rate-distortion-optimal scalar quantizers. We achieve gains of 0.5 to 1 dB with the EQ algorithm compared to the recently proposed zerotree compression for normal meshes.

Item: Multiscale Approximation of Piecewise Smooth Two-Dimensional Function using Normal Triangulated Meshes (2005-07-01)
Jansen, Maarten; Baraniuk, Richard G.; Lavu, Sridhar; Digital Signal Processing (http://dsp.rice.edu/)
Multiresolution triangulation meshes are widely used in computer graphics for representing three-dimensional (3-d) shapes. We propose to use these tools to represent 2-d piecewise smooth functions such as grayscale images, because triangles have the potential to approximate the discontinuities between the smooth pieces more efficiently than other standard tools like wavelets. We show that normal mesh subdivision is an efficient triangulation, thanks to its local adaptivity to the discontinuities. Indeed, we prove that, within a certain function class, the normal mesh representation has an optimal asymptotic error decay rate as the number of terms in the representation grows. This function class is the so-called horizon class, comprising constant regions separated by smooth discontinuities where the line of discontinuity is C^2 continuous.
This optimal decay rate is possible because normal meshes automatically generate a polyline (piecewise linear) approximation of each discontinuity, unlike the blocky piecewise-constant approximation of tensor-product wavelets. In this way, the proposed nonlinear multiscale normal mesh decomposition is an anisotropic representation of the 2-d function. The same idea of anisotropic representation lies at the basis of decompositions such as the wedgelet and curvelet transforms, but the proposed normal mesh approach has a unique construction.

Item: Multiscale Image Processing Using Normal Triangulated Meshes (2001-10-01)
Jansen, Maarten; Choi, Hyeokho; Lavu, Sridhar; Baraniuk, Richard G.; Digital Signal Processing (http://dsp.rice.edu/)
Multiresolution triangulation meshes are widely used in computer graphics for 3-d modelling of shapes. We propose an image representation and processing framework using a multiscale triangulation of the grayscale function. Triangles have the potential to approximate edges better than the blocky structures of tensor-product wavelets. Among the many possible triangulation schemes, normal meshes are a natural choice for efficiently representing singularities in image data, thanks to their adaptivity to the smoothness of the modeled image. Our nonlinear, multiscale image decomposition algorithm, based on this subdivision scheme, takes edges into account in a way that is closely related to wedgelets and curvelets. The highly adaptive property of the normal mesh construction provides a very efficient representation of images, which potentially outperforms standard wavelet transforms.
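The "one scalar per vertex" property of normal meshes mentioned in these abstracts can be illustrated with a minimal 2-d sketch (not the papers' implementation): a new vertex between two existing vertices is stored only as its signed offset from the edge midpoint along the edge's normal.

```python
import numpy as np

def normal_offset(p0, p1, curve_point):
    """One normal refinement step in 2-d: encode a new vertex between p0
    and p1 as a single scalar -- its signed distance from the edge
    midpoint along the edge's unit normal."""
    mid = (p0 + p1) / 2.0
    edge = p1 - p0
    normal = np.array([-edge[1], edge[0]])  # rotate edge by 90 degrees
    normal /= np.linalg.norm(normal)
    # the only value that needs to be stored and coded:
    return float(np.dot(curve_point - mid, normal))

def reconstruct(p0, p1, offset):
    """Recover the vertex from the base edge and the stored scalar."""
    mid = (p0 + p1) / 2.0
    edge = p1 - p0
    normal = np.array([-edge[1], edge[0]])
    normal /= np.linalg.norm(normal)
    return mid + offset * normal
```

Applied recursively, these midpoint offsets trace a polyline that hugs a discontinuity, which is the adaptivity the papers exploit.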
We demonstrate the approximation performance of the normal mesh representation through mathematical analysis for simple functions and simulations for real images.

Item: Three-dimensional geometric coding using mixture models and the estimation quantization algorithm (2003)
Lavu, Sridhar; Baraniuk, Richard G.
3D surfaces are used in applications such as animations, 3D object modeling and visualization. The geometries of such surfaces are often approximated using polygonal meshes. This thesis aims to compress 3D geometry meshes using an algorithm based on normal meshes and the Estimation-Quantization (EQ) algorithm. Normal meshes are multilevel representations in which finer-level vertices lie along the normal to the local surface, which reduces the vertex data to one scalar value per vertex. A mixture distribution model is used for the wavelet coefficients. The EQ algorithm uses local neighborhood information and rate-distortion optimization to encode the wavelet coefficients. We achieve performance gains of 0.5 to 1 dB compared to the zerotree coder for normal meshes.

Item: Volume visualization and volume painting of large data sets (2009)
Lavu, Sridhar; Warren, Joe
Volume visualization of large volume data sets has traditionally been based on surface-fitting algorithms. With recent advances in graphics computation and memory capabilities, direct volume rendering techniques that produce interactive rendering for large data sets have become feasible. We implement a custom texture-based volume rendering algorithm that takes advantage of advanced graphics hardware capabilities. Volume masking is used in volume visualization to visualize and distinguish the different regions of interest in a volume data set. Generating volume masks is a tedious and time-intensive process. We propose a new approach to generating volume masks based on volume painting.
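The per-pixel work that texture-based direct volume rendering performs on the GPU is emission-absorption compositing. A minimal CPU sketch of that compositing step (illustrative only; the thesis's GPU implementation and transfer function are not specified here) looks like:

```python
def composite_ray(samples, transfer_function):
    """Front-to-back alpha compositing of scalar samples along one viewing
    ray. `transfer_function` maps a scalar sample to (emission, opacity)."""
    color, alpha = 0.0, 0.0
    for s in samples:
        c, a = transfer_function(s)
        color += (1.0 - alpha) * a * c  # attenuate by accumulated opacity
        alpha += (1.0 - alpha) * a
        if alpha >= 0.99:               # early ray termination
            break
    return color, alpha
```

Slice-based texture rendering performs the same accumulation by blending view-aligned textured slices back to front in graphics hardware, rather than looping over rays in software.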
The volume masking relies on a material mapping that assigns the different mask values in the volume mask to different regions and colors. We build a custom cross-platform visualization application to support interactive volume visualization and volume painting for large volume data sets.
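The material-mapping step can be sketched as a lookup from integer mask values to per-voxel colors. The material table below is hypothetical, chosen only to illustrate the mechanism; the thesis's actual materials and application API are not described here.

```python
import numpy as np

# Hypothetical material table: mask value -> RGBA applied at render time.
MATERIALS = {
    0: (0.0, 0.0, 0.0, 0.0),  # background: fully transparent
    1: (0.8, 0.2, 0.2, 0.6),  # painted region of interest A
    2: (0.2, 0.4, 0.9, 0.4),  # painted region of interest B
}

def apply_material_map(mask):
    """Map an integer volume mask to per-voxel RGBA via the material table."""
    rgba = np.zeros(mask.shape + (4,), dtype=np.float32)
    for value, color in MATERIALS.items():
        rgba[mask == value] = color  # boolean-mask assignment per material
    return rgba
```

In a GPU renderer this table would typically live in a small lookup texture so that repainting a region only updates the mask, not the volume data itself.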