Rice University Graduate Research
Browsing Rice University Graduate Research by Title
Now showing 1 - 20 of 47
Item
A Computational Model of Routine Procedural Memory (Rice University, 2009)
Tamborello, Franklin Patrick, II; Byrne, Michael D.
Cooper and Shallice (2000) implemented a computational version of Norman and Shallice's (1986) Contention Scheduling Model (CSM). The CSM is a hierarchically organized network of action schemas and goals. Botvinick and Plaut (2004) instead took a connectionist approach to modeling routine procedural behavior. They argued in favor of holistic, distributed representation of learned step co-occurrence associations. Two experiments found that people can adapt routine procedural behavior to changing circumstances quite readily and that other factors besides statistical co-occurrence can influence action selection. A CSM-inspired ACT-R model of the two experiments is the first to postdict differential error rates across multiple between-subjects conditions and trial types. Results from the behavioral and modeling studies favor a CSM-like theory of human routine procedural memory that uses discrete, hierarchically organized goal and action representations that are adaptable to new but similar procedures.

Item
A Framework for Testing Concurrent Programs (Rice University, 2011)
Ricken, Mathias; Cartwright, Robert S.
This study proposes a new framework that can effectively apply unit testing to concurrent programs, which are difficult to develop and debug. Test-driven development, a practice enabling developers to detect bugs early by incorporating unit testing into the development process, has become widespread, but it has only been effective for programs with a single thread of control. The order of operations in different threads is essentially non-deterministic, making it more complicated to reason about program properties in concurrent programs than in single-threaded programs. Because hardware, operating systems, and compiler optimizations influence the order in which operations in different threads are executed, debugging is problematic, since a problem often cannot be reproduced on other machines. Multicore processors, which have replaced older single-core designs, have exacerbated these problems because they demand the use of concurrency if programs are to benefit from new processors. The existing tools for unit testing concurrent programs are either flawed or too costly. JUnit, for instance, assumes that programs are single-threaded and therefore does not work for concurrent programs; ConTest and rstest predate the revised Java memory model and make incorrect assumptions about the operations that affect synchronization. Approaches such as model checking or comprehensive schedule-based execution are too costly to be used frequently. All of these problems prevent software developers from adopting the current tools on a large scale. The proposed framework (i) improves JUnit to recognize errors in all threads, a necessary development without which all other improvements are futile, (ii) places some restrictions on the programs to facilitate automatic testing, (iii) provides tools that reduce programmer mistakes, and (iv) re-runs the unit tests with randomized schedules to simulate execution under different conditions and on different machines, increasing the probability that errors are detected. The improvements and restrictions, shown not to seriously impede programmers, reliably detect problems that the original JUnit missed. The execution with randomized schedules reveals problems that rarely occur under normal conditions. With an effective testing tool for concurrent programs, developers can test programs more reliably and decrease the number of errors in spite of the proliferation of concurrency demanded by modern processors.
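The randomized-schedule idea in the abstract above can be illustrated outside of JUnit as well. The following is a minimal Python sketch, not the JUnit-based framework the thesis describes: a deliberately racy counter is exercised by a small "unit test" that is re-run many times with random delays standing in for schedule perturbation, so that rare interleavings (and the races they expose) surface. All names and parameters here are illustrative assumptions.

```python
# A minimal sketch of re-running a concurrent test under randomized schedules
# (illustrative only; not the thesis's framework).
import random
import threading
import time


class RacyCounter:
    """A counter with an intentional data race (read-modify-write without a lock)."""
    def __init__(self):
        self.value = 0

    def increment(self):
        current = self.value
        # Randomized delay stands in for schedule perturbation: it widens the window
        # in which another thread can interleave between the read and the write.
        time.sleep(random.uniform(0, 0.001))
        self.value = current + 1


def run_once(num_threads=8, increments=50):
    counter = RacyCounter()
    threads = [
        threading.Thread(target=lambda: [counter.increment() for _ in range(increments)])
        for _ in range(num_threads)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter.value == num_threads * increments  # the "unit test" assertion


if __name__ == "__main__":
    # Under a single deterministic schedule the bug may never appear; repeated runs
    # with randomized delays make a lost update overwhelmingly likely to be caught.
    failures = sum(1 for _ in range(20) if not run_once())
    print(f"{failures}/20 randomized runs detected the lost-update race")
```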
Item
Action of the Mazur pattern up to topological concordance (arXiv, 2024)
Manchester, Alex
In the '80s, Freedman showed that the Whitehead doubling operator acts trivially up to topological concordance. On the other hand, Akbulut showed that the Whitehead doubling operator acts nontrivially up to smooth concordance. The Mazur pattern is a natural candidate for a satellite operator which acts by the identity up to topological concordance but not up to smooth concordance. Recently there has been a resurgence of study of the action of the Mazur pattern up to concordance in the smooth and topological categories. Examples showing that the Mazur pattern does not act by the identity up to smooth concordance have been given by Cochran--Franklin--Hedden--Horn and Collins. In this paper, we give evidence that the Mazur pattern acts by the identity up to topological concordance. In particular, we show that two satellite operators $P_{K_0,\eta_0}$ and $P_{K_1,\eta_1}$ with $\eta_0$ and $\eta_1$ freely homotopic have the same action on the topological concordance group modulo the subgroup of $(1)$-solvable knots, which gives evidence that they act in the same way up to topological concordance. In particular, the Mazur pattern and the identity operator are related in this way, and so this is evidence for the topological side of the analogy to the Whitehead doubling operator. We give additional evidence that they have the same action on the full topological concordance group by showing that up to topological concordance they cannot be distinguished by Casson-Gordon invariants or metabelian $\rho$-invariants.

Item
Amplified Encounters at High Speed (Rice University, 2011)
Sibley, Rebecca; Pope, Albert
This thesis expands upon the dialogue between speed and architecture, investigating how architecture reinterprets the linear city, originally defined by the continuous fabric of the freeway and more recently reconfigured by the high speed rail line. Using the linear city as a site of exploration and high speed rail as a ground to test new typologies of architectural insertions at amplified speed, this thesis produces an extended civic space along the proposed high speed rail line connecting Tampa and Orlando. Combining a series of performance and commercial programs, this new typology will make the obscured visual experience along the extended territory of the rail line legible through a sequencing of specific architectural intersections, exploring how monumental civic space will be made and occupied in the sprawl of the American city.

Item
Application of a fully polynomial randomized approximation scheme (FPRAS) to infrastructure system reliability assessments (8/6/2017)
Fu, Bowen; Dueñas-Osorio, Leonardo
Networked systems make the reliability assessment of critical infrastructure computationally challenging given the combinatorial nature of system-level states. Several methods, from numerical schemes such as Monte Carlo Simulation (MCS) to analytical approaches such as recursive decomposition algorithms (RDA), have been applied to this stochastic network problem. Despite progress over several decades, the problem remains open because of its intrinsic computational complexity. As the structural facilities of infrastructure systems continue to interconnect in network forms, their study steers analysts to develop system reliability assessment methods based on graph theory and network science. A fully polynomial randomized approximation scheme (FPRAS) based on Karger's graph contraction algorithm is an approximating method for reliability evaluation which has a property rarely exploited in engineering reliability: by performing a number of experiments in polynomial time (as a function of system size), it provides an a priori theoretical guarantee that the reliability estimate falls into the ϵ-neighborhood of its true value with (1−δ) confidence. We build upon the FPRAS ideas to develop an s-t reliability version that has practical appeal. Focusing on the relevant-cut enumeration stage of the FPRAS, we find correlations between the recurrence frequencies of links in minimum cuts within the randomization phase of the contraction algorithm and typical network topological properties. We employ LASSO regression analysis to approximate the relationship between link recurrence frequencies and such topological metrics. With the topology-informed link recurrence frequencies, obtained at a much lower computational cost, we use a new biased contraction probability yielding 16.9% more distinct minimum cuts (MinCuts) than the original random contraction scheme. The biased contraction scheme proposed here can significantly improve the efficiency of reliability evaluation of networked infrastructure systems, while supporting infrastructure system design, maintenance, and restoration given its ability to offer error guarantees, which are ideal for future prescriptive guidelines in practice.
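For readers unfamiliar with the contraction algorithm the abstract builds on, here is a minimal Python sketch of Karger's random edge contraction for a global minimum cut. It is not the authors' s-t reliability FPRAS: the relevant-cut enumeration, the (ϵ, δ) guarantee obtained by repeating such randomized trials polynomially many times, and the biased contraction probabilities are all omitted, and the example graph and trial count are assumptions for illustration.

```python
# Karger's random contraction, sketched with a union-find over supernodes.
import random


def karger_min_cut(edges, trials=200):
    """edges: list of (u, v) pairs of an undirected multigraph. Returns the smallest
    cut size observed over `trials` independent random contraction runs."""
    best = float("inf")
    for _ in range(trials):
        parent = {}

        def find(x):
            parent.setdefault(x, x)
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path halving
                x = parent[x]
            return x

        nodes = {v for e in edges for v in e}
        remaining = len(nodes)
        while remaining > 2:
            u, v = random.choice(edges)        # uniform contraction; the paper biases this choice
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv                # contract the edge (merge supernodes)
                remaining -= 1
        cut = sum(1 for u, v in edges if find(u) != find(v))
        best = min(best, cut)
    return best


if __name__ == "__main__":
    # Two triangles joined by a single bridge: the true minimum cut is 1.
    g = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
    print(karger_min_cut(g))
```

Repeating the contraction many times and tracking which links keep appearing in small cuts is, roughly, the "recurrence frequency" signal the abstract correlates with topological metrics.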
Item
Autocorrelation Reflectivity of Mars (Wiley, 2020)
Deng, Sizhuang; Levander, Alan
The seismic structure of the Martian interior can shed light on the formation and dynamic evolution of the planet and our solar system. The deployment of the seismograph carried by the InSight mission provides a means to study Martian internal structure. We used ambient noise autocorrelation to analyze the available vertical-component seismic data to recover the reflectivity beneath the InSight lander. We identify the noise that is approximately periodic with the Martian sol as daily lander operations and the diurnal variation in Martian weather and tides. To investigate the seismic discontinuities at different depths, the autocorrelograms are filtered and stacked into different frequency bands. In the stacked autocorrelograms we observe prominent reflection signals probably corresponding to the Martian Moho, the olivine-wadsleyite transition in the mantle, and the core-mantle boundary. We estimate the depths of these boundaries as ~35, 1,110–1,170, and 1,520–1,600 km, respectively, consistent with other estimates.
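The processing idea in the abstract (autocorrelate single-station ambient noise, band-pass filter, and stack so that persistent reflections reinforce while incoherent noise averages out) can be sketched as follows. This is a schematic illustration on synthetic noise with assumed sampling rate, window length, and band; it is not the authors' actual workflow or the InSight data.

```python
# Schematic single-station ambient-noise autocorrelation and stacking.
import numpy as np
from scipy.signal import butter, filtfilt


def bandpass(trace, fs, fmin, fmax, order=4):
    b, a = butter(order, [fmin, fmax], btype="band", fs=fs)
    return filtfilt(b, a, trace)


def stacked_autocorrelogram(noise, fs, window_s=600.0, fmin=1.0, fmax=3.0):
    """noise: 1-D vertical-component record. Returns the normalized stack of
    one-sided autocorrelations over non-overlapping windows."""
    n = int(window_s * fs)
    stack, windows = None, 0
    for start in range(0, len(noise) - n + 1, n):
        seg = bandpass(noise[start:start + n] - noise[start:start + n].mean(), fs, fmin, fmax)
        ac = np.correlate(seg, seg, mode="full")[n - 1:]   # lags 0 .. n-1
        ac /= np.abs(ac[0])                                # normalize by zero-lag energy
        stack = ac if stack is None else stack + ac
        windows += 1
    return stack / windows


if __name__ == "__main__":
    fs = 20.0                                              # Hz, assumed decimated rate
    rng = np.random.default_rng(0)
    synthetic = rng.normal(size=int(fs * 3600 * 6))        # six hours of synthetic noise
    print(stacked_autocorrelogram(synthetic, fs)[:5])
```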
Item
Book review: Barnstorming the Prairies: How Aerial Vision Shaped the Midwest, by Jason Weems (Rice University, 2018)
LaFlamme, Marcel; University of Nebraska Press

Item
Book review: Grégoire Chamayou, A Theory of the Drone (Rice University, 2016-11)
LaFlamme, Marcel; SAGE

Item
Carbon Nanotubes Filled Polymer Composites: A Comprehensive Study on Improving Dispersion, Network Formation and Electrical Conductivity (Rice University, 2010)
Chakravarthi, Divya Kannan; Barrera, Enrique V.
In this dissertation, we determine how the dispersion, network formation, and alignment of carbon nanotubes in polymer nanocomposites affect the electrical properties of two different polymer composite systems: high-temperature bismaleimide (BMI) and polyethylene. The knowledge gained from this study will facilitate optimization of the above-mentioned parameters, which would further enhance the electrical properties of polymer nanocomposites. BMI carbon fiber composites filled with nickel-coated single-walled carbon nanotubes (Ni-SWNTs) were processed using high-temperature vacuum-assisted resin transfer molding (VARTM) to study the effect of lightning strike mitigation. Coating the SWNTs with nickel resulted in enhanced dispersions, confirmed by atomic force microscopy (AFM) and dynamic light scattering (DLS). An improved interface between the carbon fiber and Ni-SWNTs resulted in better surface coverage on the carbon plies. These hybrid composites were tested for Zone 2A lightning strike mitigation. The electrical resistivity of the composite system was reduced by ten orders of magnitude with the addition of 4 weight percent Ni-SWNTs (calculated with respect to the weight of a single carbon ply). The Ni-SWNT-filled composites showed a reduced amount of damage from simulated lightning strike compared to their unfilled counterparts, indicated by the minimal carbon fiber pull-out. Methods to reduce the electrical resistivity of 10 weight percent SWNT-medium density polyethylene (MDPE) composites were also studied. The composites processed by the hot coagulation method were subjected to low DC electric fields (10 V) at polymer melt temperatures to study the effect of viscosity, nanotube welding, dispersion, and resultant changes in electrical resistivity. The electrical resistivity of the composites was reduced by two orders of magnitude compared to the 10 wt% CNT-MDPE baseline. For effective alignment of SWNTs, a new process called Electric Field Vacuum Spray was devised to overcome viscosity within the dispersed nanotube polymer system and produce conductive MDPE-SWNT thin films. Polarized Raman spectroscopy and scanning electron microscopy (SEM) analysis of the samples showed an improvement in SWNT-SWNT contacts and alignment in the polymer matrix. The resistivity of the samples processed by this new method was two orders of magnitude lower than that of the samples processed by the hot coagulation method and subjected to an electric field.

Item
Complicated Desires: Yto Barrada (Rice University, 2007-10-04)
Hooper, Rachel; Walker Art Center
Artist entry on Yto Barrada in the "Brave New Worlds" catalogue

Item
Conflict of Employee–Employer Interest: Introducing an Optimal Work Happiness Framework (Rice University, 2012)
Conlon, Paul M.
This qualitative study seeks to explore the pursuit and achievement of work happiness for employees. Conflicting interests between organizations (i.e., employers) and employees provide insights into the difficulties employees face in their quest for work happiness. Assumptions are examined concerning constraints on achieving work happiness due to organizational hierarchy. A framework postulated by the author lacks quantitative measurement yet offers an encompassing, simple-enough model for further investigation. The optimal work happiness model hypothesizes that individuals in the workplace desire to feel valued, to believe others value their contributions, to find work meaningful, and to enjoy the work they do. Moreover, when employers genuinely believe workers contribute value to the organization, they may communicate this appreciation to them, thus causing employees to derive increased work happiness. In turn, each individual must assess the value of his or her own employment identity and determine whether or not work is meaningful and enjoyable for himself or herself. Future quantitative research is needed to reinforce the constructs of this qualitative study.
Item
Cooperative Partial Detection for MIMO Relay Networks (Rice University, 2011)
Amiri, Kiarash; Cavallaro, Joseph R.
Cooperative communication has recently re-emerged as a possible paradigm shift to realize the promises of the ever-increasing wireless communication market; however, there have been few, if any, studies to translate theoretical results into feasible schemes with their particular practical challenges. The multiple-input multiple-output (MIMO) technique is another method that has been recently employed in different standards and protocols, often as an optional scenario, to further improve the reliability and data rate of different wireless communication applications. In this work, we look into possible methods and algorithms for combining these two techniques to take advantage of the benefits of both. In this thesis, we will consider methods that account for the limitations of practical solutions, which, to the best of our knowledge, are considered here for the first time in this context. We will present complexity reduction techniques for MIMO detection in cooperative systems. Furthermore, we will present architectures for flexible and configurable MIMO detectors. These architectures can support a range of data rates, modulation orders, and numbers of antennas, and therefore are crucial in the different nodes of cooperative systems. The breadth-first search employed in our realization presents a large opportunity to exploit the parallelism of the FPGA in order to achieve high data rates. Algorithmic modifications to address potential sequential bottlenecks in the traditional breadth-first search-based SD are highlighted in the thesis. We will present a novel Cooperative Partial Detection (CPD) approach in MIMO relay channels, where instead of applying the conventional full detection in the relay, the relay performs a partial detection and forwards the detected parts of the message to the destination. We will demonstrate how this approach leads to controlling the complexity in the relay and helping it choose how much it is willing to cooperate based on its available resources. We will discuss the complexity implications of this method, and more importantly, present hardware verification and over-the-air experimentation of CPD using the Wireless Open-access Research Platform (WARP).

Item
Demonstration of Piecewise Cubic Polynomial Fitting on Mesoscale Tester Data (Rice University, 2020)
Mehta, Shail Maharshi; De Santos, Diego Ricardo; Sridhar, Shweta; Aguayo, Veronica Cristina; Meraz, Carlos Alberto; Mikos, Mary; Grande-Allen, K. Jane; Bioengineering
Animation of piecewise cubic polynomial fitting to data from a replicate of 1:15 PDMS, performed in order to obtain stress values at exact increments of strain.
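The dataset entry above describes fitting a piecewise cubic polynomial to stress-strain measurements so that stress can be read off at exact strain increments. Below is a minimal Python sketch of one such approach using SciPy's shape-preserving piecewise cubic interpolant; the authors' exact fitting procedure, column names, strain step, and the illustrative data values are assumptions, not taken from the dataset.

```python
# Read stress at exact strain increments from a piecewise cubic fit (illustrative sketch).
import numpy as np
from scipy.interpolate import PchipInterpolator


def stress_at_strain_increments(strain, stress, step=0.01):
    """strain, stress: 1-D arrays of raw measurements (strain need not be evenly spaced).
    Returns (strain_grid, stress_grid) evaluated at exact multiples of `step`."""
    order = np.argsort(strain)
    strain, stress = np.asarray(strain)[order], np.asarray(stress)[order]
    strain, idx = np.unique(strain, return_index=True)   # interpolator needs strictly increasing x
    stress = stress[idx]
    fit = PchipInterpolator(strain, stress)              # shape-preserving piecewise cubic
    grid = np.arange(0.0, strain.max() + 1e-12, step)
    grid = grid[grid >= strain.min()]
    return grid, fit(grid)


if __name__ == "__main__":
    raw_strain = np.array([0.0, 0.013, 0.026, 0.041, 0.055, 0.072, 0.090])   # hypothetical samples
    raw_stress = np.array([0.0, 1.8, 3.5, 5.4, 7.0, 8.9, 10.7])              # illustrative units
    for e, s in zip(*stress_at_strain_increments(raw_strain, raw_stress, step=0.01)):
        print(f"strain={e:.2f}  stress={s:.2f}")
```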
Item
The Dizzident: Lia Perjovschi (Rice University, 2007-10-04)
Hooper, Rachel; Walker Art Center
Artist entry on Lia Perjovschi in the "Brave New Worlds" catalogue

Item
Imprints of Conflict: Yael Bartana (Rice University, 2007-10-04)
Hooper, Rachel; Walker Art Center
Artist entry on Yael Bartana for the "Brave New Worlds" catalogue

Item
Induction and Intuition, on the Center for Land Use Interpretation's Methodology (Rice University, 2009-01-01)
Hooper, Rachel; Cynthia Woods Mitchell Center for the Arts at the University of Houston; Blaffer Gallery
Since 1994, the Center for Land Use Interpretation (CLUI), a research organization based in Culver City, California, has studied the U.S. landscape, using multidisciplinary research, information processing, and interpretive tools to stimulate thought and discussion around contemporary land-use issues. During a residency at the University of Houston Cynthia Woods Mitchell Center for the Arts, the CLUI established a field station on the banks of Buffalo Bayou, revealing aspects of the relationship between oil and the landscape in Houston that are often overlooked, even by the city's residents. The CLUI's findings are presented in this volume and a concurrent exhibition at the Blaffer Gallery, titled Texas Oil: Landscape of an Industry. The book documents the CLUI's methodology in a series of interviews and includes a photographic essay on land use in Houston featuring a panoramic, foldout section and a comprehensive chronology of the CLUI's projects and publications over the past 14 years.

Item
Josephine Meckseper Interviewed by Rachel Hooper (Rice University, 2011-03-16)
Hooper, Rachel; Meckseper, Josephine; Sharjah Art Foundation
Interview with Josephine Meckseper included in the catalogue for Sharjah Biennial 10, "Plot for a Biennial"

Item
jsut that way (Rice University, 2012-09-26)
Hooper, Rachel; University of Texas Press and Blaffer Art Museum
Andy Coolquitt makes objects and environments that exist in symbiosis with human relationships. During the 1990s, his life and work revolved around an expansive studio/artist commune/performance space/living sculpture/party place on the east side of Austin, Texas, where he continues to live, work, and host events. Intrigued by social contracts, Coolquitt creates artwork that facilitates conversation and interaction, augmenting the energy and frictions generated by individuals forming a community. He chooses materials that show the wear and tear of practical use, and, over the years, he has refined an artistic practice based on the collection, study, and reuse of things scavenged from the streets around him. Since his 2008 solo exhibition iight in New York City, Coolquitt's work has gained a wide national and international audience. Andy Coolquitt is the first comprehensive monograph on the artist's work. Published in conjunction with a solo museum exhibition at Blaffer Art Museum, this volume displays the full range of Coolquitt's work over the past twenty-five years, including images of site-specific installations that no longer exist. Accompanying the color plates are an introduction and chronology of the artist's work by exhibition curator Rachel Hooper, an essay tracing Coolquitt's connections to other contemporary artists and designers by Frieze magazine senior editor Dan Fox, an in-depth exploration of Coolquitt's concepts and process by art writer Jan Tumlir, an interview with Coolquitt by Matthew Higgs, director and chief curator of White Columns, and Coolquitt's biography and bibliography.

Item
Learning the Structure of High-Dimensional Manifolds with Self-Organizing Maps for Accurate Information Extraction (Rice University, 2011)
Zhang, Lili; Merenyi, Erzsebet
This work aims to improve the capability of accurate information extraction from high-dimensional data with a specific neural learning paradigm, the Self-Organizing Map (SOM). The SOM is an unsupervised learning algorithm that can faithfully sense manifold structure and support supervised learning of relevant information from the data. Yet open problems regarding SOM learning exist. We focus on the following two issues. 1. Evaluation of topology preservation. Topology preservation is essential for SOMs in faithful representation of manifold structure. However, in reality, topology violations are not unusual, especially when the data have complicated structure. Measures capable of accurately quantifying and informatively expressing topology violations are lacking. One contribution of this work is a new measure, the Weighted Differential Topographic Function (WDTF), which differentiates an existing measure, the Topographic Function (TF), and incorporates detailed data distribution as an importance weighting of violations to distinguish severe violations from insignificant ones. Another contribution is an interactive visual tool, TopoView, which facilitates the visual inspection of violations on the SOM lattice. We show the effectiveness of the combined use of the WDTF and TopoView through a simple two-dimensional data set and two hyperspectral images. 2. Learning multiple latent variables from high-dimensional data. We use an existing two-layer SOM-hybrid supervised architecture, which captures the manifold structure in its SOM hidden layer and then uses its output layer to perform the supervised learning of latent variables. In the customary way, the output layer only uses the strongest output of the SOM neurons. This severely limits the learning capability. We allow multiple, k, strongest responses of the SOM neurons for the supervised learning. Moreover, the fact that different latent variables can be best learned with different values of k motivates a new neural architecture, the Conjoined Twins, which extends the existing architecture with additional copies of the output layer for preferential use of different values of k in the learning of different latent variables. We also automate the customization of k for different variables with statistics derived from the SOM. The Conjoined Twins shows its effectiveness in the inference of two physical parameters from Near-Infrared spectra of planetary ices.

Item
Lessons in Buffoonery and Bravado: Erik Van Lieshout (Rice University, 2007)
Hooper, Rachel; Walker Art Center
Artist entry on Erik Van Lieshout for the "Brave New Worlds" catalogue