Browsing by Author "Chauhan, Arun"
Now showing 1 - 5 of 5
Item
Domain-Specific Type Inference for Library Generation in a Telescoping Compiler (2004-04-14)
Chauhan, Arun; Kennedy, Ken; McCosh, Cheryl
Telescoping languages is a strategy for allowing users to develop code in high-level, domain-specific languages and still achieve high performance. It uses extensive offline processing of the library defining the language. This process speculatively determines the possible uses of the library subroutines and generates variants specialized toward those uses. LibGen is a telescoping-language system for generating high-performance Fortran or C libraries with multiple specialized variants from a single version of MATLAB prototype code. LibGen uses variable types to guide specialization. Previously, we have shown that the generated code has comparable performance to hand-coded and optimized Fortran libraries and that specialization on type is important for achieving high performance. In this paper, we describe the type inference system necessary for LibGen to speculate on the possible variants of library procedures and to generate code. We develop the concept of type jump-functions, which describe the transfer of type information through and across procedures. To compute these type jump-functions, we develop a static type-inference approach that uses a constraint-based formulation and a graph-theoretical algorithm shown to be efficient under conditions met in most practical cases.
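The type jump-functions described in the abstract above can be made concrete with a small sketch. The Python fragment below is a minimal, hypothetical rendering of the idea, not LibGen's actual representation: a type is a (base, rank) pair, each library procedure carries a jump function from argument types to result types, and a caller's variable types are obtained by propagating types through its sequence of calls.

```python
# Minimal sketch of a "type jump-function": a summary that maps the types of
# a procedure's inputs to the types of its outputs, so callers can be
# analyzed without re-examining the procedure body. All names here are
# illustrative, not LibGen's actual data structures.

# A "type" is a (base, rank) pair, e.g. ('double', 0) for a real scalar,
# ('complex', 2) for a complex matrix.

def widen(a, b):
    """Join two types in a tiny double/complex lattice: complex absorbs
    double, and the larger rank wins."""
    base = 'complex' if 'complex' in (a[0], b[0]) else 'double'
    return (base, max(a[1], b[1]))

# Jump functions for two hypothetical library procedures: given the tuple of
# argument types, return the tuple of result types.
JUMP_FUNCTIONS = {
    # elementwise product: result has the widened base type and max rank
    'elem_mul': lambda x, y: (widen(x, y),),
    # fft: always produces a complex result of the same rank as its input
    'fft': lambda x: (('complex', x[1]),),
}

def infer(call_sequence, env):
    """Propagate types through a straight-line sequence of calls.
    call_sequence: list of (procedure, argument_names, result_names)."""
    for proc, args, results in call_sequence:
        out = JUMP_FUNCTIONS[proc](*(env[a] for a in args))
        env.update(zip(results, out))
    return env

# Example: y = fft(x); z = elem_mul(y, w), where x and w are real vectors.
env = infer([('fft', ['x'], ['y']), ('elem_mul', ['y', 'w'], ['z'])],
            {'x': ('double', 1), 'w': ('double', 1)})
print(env['z'])   # ('complex', 1): z is inferred to be a complex vector
```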
Item
Scalability and Data Placement on SGI Origin (1998-04-01)
Chauhan, Arun; Ding, Chen; Sheraw, Berry
Cache-coherent non-uniform memory access (ccNUMA) architectures have attracted considerable academic and industry interest as a promising direction for large-scale parallel computing. Data placement has been used as a major optimization method on such machines. This study examined the scalability and the effect of data placement on a state-of-the-art ccNUMA machine, the SGI Origin, using 16 processors. Three applications from SPLASH-2 were used: FFT, Radix, and Barnes-Hut. The results showed that FFT and Radix cannot scale to 16 processors with 70% efficiency even for the largest data sizes tested. Barnes-Hut does not scale for small data sizes but scales linearly for large input sizes. The results also showed that data placement makes no difference to performance for any of the three applications. We attribute these results to the effect of the advanced uniprocessor used in the Origin, the R10K, the optimizing compiler, and the aggressive communication architecture. Some of our results are quite different from the predictions of two recent simulation studies of directory-based ccNUMA machines (Holt:ISCA96) and (Pai:HPCA97), especially for FFT. These differences are partly due to the fact that the machine models used in the previous simulation studies differ from the Origin in some important respects. Our results also cover data sizes larger than those in any of the previous simulation studies. To increase our confidence in the latency numbers and the data placement tools, we also measured memory latencies using micro-benchmarks.
Item
Telescoping MATLAB for DSP Applications (2003-12-05)
Chauhan, Arun
This dissertation presents the design and implementation of a prototype MATLAB compiler for Digital Signal Processing (DSP) libraries, based on telescoping languages, a novel approach to compiling high-level languages. The thesis of this work is that it is possible to effectively and efficiently compile DSP libraries written in MATLAB using the telescoping languages approach, which aims to automatically develop domain-specific application development environments based on component libraries for high-performance computing. Initial studies on DSP applications demonstrated that the approach was promising. During this study, two new techniques, procedure strength reduction and procedure vectorization, were developed. In joint work, a new approach to MATLAB type inference was developed. The inferred type information can be used to specialize MATLAB libraries and generate code in C or Fortran. A new technique for engineering the optimizing compiler emerged during the course of the compiler development. This technique allows the optimizations of interest to be expressed in an XML-based language and the optimizer in the compiler to be a light-weight specialization engine. The type inference engine and type-based specialization were evaluated on a set of DSP procedures that constitute an informal library used by researchers in the Electrical and Computer Engineering department at Rice. The evaluation validated the effectiveness of the library generation strategy driven by specialization.
Item
Telescoping MATLAB for DSP applications (2004)
Chauhan, Arun; Kennedy, Ken
This dissertation presents the design and implementation of a prototype MATLAB compiler for Digital Signal Processing (DSP) libraries, based on telescoping languages, a novel approach to compiling high-level languages. The thesis of this work is that it is possible to effectively and efficiently compile DSP libraries written in MATLAB using the telescoping languages approach, which aims to automatically develop domain-specific application development environments based on component libraries for high-performance computing. Initial studies on DSP applications demonstrated that the approach was promising. During this study, two new techniques, procedure strength reduction and procedure vectorization, were developed. In joint work, a new approach to MATLAB type inference was developed. The inferred type information can be used to specialize MATLAB libraries and generate code in C or Fortran. A new technique for engineering the optimizing compiler emerged during the course of the compiler development. This technique allows the optimizations of interest to be expressed in an XML-based language and the optimizer in the compiler to be a light-weight specialization engine. The type inference engine and type-based specialization were evaluated on a set of DSP procedures that constitute an informal library used by researchers in the Electrical and Computer Engineering department at Rice. The evaluation validated the effectiveness of the library generation strategy driven by specialization.
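Procedure strength reduction, one of the two techniques named in the dissertation abstracts above, can be sketched roughly as follows. This is a minimal Python illustration under assumed names (filter_signal and its split-out variants), not the dissertation's actual transformation: a procedure called repeatedly in a loop is split into an initialization part that depends only on loop-invariant arguments and an iteration part that reuses the precomputed state, so the invariant work is done once.

```python
# Illustrative sketch of procedure strength reduction (hypothetical example,
# not the dissertation's transformation).

def filter_signal(coeffs, x):
    """Original library procedure: normalizes the (loop-invariant)
    coefficients on every call, then applies them to x."""
    norm = sum(abs(c) for c in coeffs)          # depends only on coeffs
    scaled = [c / norm for c in coeffs]         # depends only on coeffs
    return [c * v for c, v in zip(scaled, x)]   # varies with x

# --- after procedure strength reduction: two cooperating variants ---

def filter_signal_init(coeffs):
    """Hoisted part: computed once per distinct coeffs."""
    norm = sum(abs(c) for c in coeffs)
    return [c / norm for c in coeffs]

def filter_signal_iter(scaled, x):
    """Per-iteration part: reuses the precomputed state."""
    return [c * v for c, v in zip(scaled, x)]

# Caller before:  for x in frames: y = filter_signal(coeffs, x)
# Caller after:
coeffs = [1.0, 2.0, 1.0]
frames = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]
state = filter_signal_init(coeffs)              # once, outside the loop
outputs = [filter_signal_iter(state, x) for x in frames]
print(outputs[0])   # [0.25, 1.0, 0.75]
```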
Item
Type-Based Speculative Specialization in a Telescoping Compiler for MATLAB (2003-01-17)
Chauhan, Arun; Kennedy, Ken; McCosh, Cheryl
Telescoping languages is a strategy to automatically generate highly-optimized domain-specific libraries. The key idea is to create specialized variants of library procedures through extensive offline processing. This paper describes a telescoping system, called ARGen, which generates high-performance Fortran or C libraries from prototype Matlab code for the linear algebra library ARPACK. ARGen uses variable types to guide procedure specialization for possible calling contexts. We show that type-based specialization of the generated libraries can lead to more than 50% speedup. ARGen needs to infer Matlab types in order to speculate on the possible variants of library procedures, as well as to generate code. This paper develops an approach that combines static and dynamic type inference and includes a graph-theoretic algorithm shown to be efficient under a set of conditions that are easily met in most practical cases. The ideas developed here provide a basis for building a more general telescoping system for Matlab.
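The type-based speculative specialization described in this last abstract can be illustrated, very loosely, with the sketch below. The names (VARIANTS, classify, specialized_sqrt) are hypothetical, and the variants are plain Python functions standing in for the specialized Fortran or C routines ARGen would generate; the runtime classify check stands in for the dynamic side of the combined static and dynamic type inference, while a call site whose argument types are resolved statically would be bound directly to the matching variant with no runtime check.

```python
# Illustrative sketch of type-based speculative specialization (hypothetical
# names, not ARGen's actual machinery): variants of a library procedure are
# generated offline for the argument-type signatures the compiler speculates
# on, and a call is bound to the variant matching its inferred types.

import cmath
import math

# Speculatively generated variants, keyed by argument type signature.
VARIANTS = {
    ('real_vector',): lambda xs: [math.sqrt(v) for v in xs],
    ('complex_vector',): lambda xs: [cmath.sqrt(v) for v in xs],
}

def classify(xs):
    """Crude stand-in for type inference on a caller's argument."""
    if any(isinstance(v, complex) for v in xs):
        return ('complex_vector',)
    return ('real_vector',)

def specialized_sqrt(xs):
    """Dispatch to the variant matching the inferred type signature."""
    return VARIANTS[classify(xs)](xs)

print(specialized_sqrt([1.0, 4.0, 9.0]))     # real variant: [1.0, 2.0, 3.0]
print(specialized_sqrt([1.0, -4.0 + 0j]))    # complex variant is selected
```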