Rice University Theses and Dissertations
Rice University makes all graduate theses and dissertations (1916-present) available online at no cost to end users.
Occasionally a thesis or dissertation may be missing from the repository. If you are unable to find a specific dissertation, please let us know and we will attempt to make it available through the repository, provided that the author has not elected for it to be embargoed.
News
Visit the web site for Rice University's Office of Graduate and Postdoctoral Studies for more information about Rice graduate student requirements for theses and dissertations.
Recent Submissions
Now showing 1 - 20 of 14383
Item Embargo
Utilizing Pairwise Interactions to Understand Collagen Triple Helix Assembly (2024-07-30) Cole, Carson Calvin; Hartgerink, Jeffrey D.; Marti-Arbona, Angel
Collagen, the most abundant protein in the human body, exhibits diverse physical structures upon assembly. Despite the remarkable similarity in the chemical sequence of amino acids within the collagen triple helix, the origins of the resulting macromolecular properties remain poorly understood in some cases. This thesis investigates collagen supramolecular assembly by synthesizing, characterizing, and applying a range of collagen mimetic peptides to elucidate pairwise interaction geometry, collagen folding, and polymerization. Collagenous proteins have the canonical amino acid sequence of Xaa-Yaa-Gly, where the Xaa is frequently proline and the Yaa is 4-hydroxyproline. Substitutions of the Xaa and Yaa positions determine the relative stability of a pairwise interaction in the triple helix. By analyzing the stabilizing and destabilizing effects of pairwise cation-π interaction geometries between cationic and aromatic amino acids in two sequential relationships, axial and lateral, we showed only the axial relationship is stabilizing. By understanding the nuances of amino acid presentation in the triple helix, lateral charge pairs stabilized the triple helix when the cation was in the Xaa position and the anion was in the Yaa position. As a result, we expanded the design toolbox for collagen mimetic design. Most natural collagens are heterotrimeric, where the three strands are nonequivalent. Incorporating these pairwise interactions yielded an ABC-type heterotrimeric crystal structure with excellent specificity. This specificity enhanced the folding rate of the triple helix, and we confirmed that the folding paradigm was concentration-independent. We further demonstrated that heterotrimeric collagens undergo an equilibrium-mediated assembly process that necessitates complete unfolding before adopting its constituent strands’ correct ABC register. Furthermore, this work incorporated cation-π interactions to develop a hydrogel material emulating fibrous collagens. The peptide structure formed diverse, porous, and fibrous networks upon assembly. In addition, by simple oxidation, we demonstrated that templating the cation-π interaction can introduce intrahelical and inter-helical covalent bonds. Given the critical role of the triple helix as the foundation for the macromolecular properties found in the human body, these findings demonstrate the potential to advance our understanding of collagen folding in disease and create next-generation biomimetic scaffolds.

Item Embargo
On Graphs with Finite-Time Consensus and Their Use in Gradient Tracking (2024-05-20) Nguyen, Edward Duc Hien; Uribe, César
A crucial design decision when employing distributed or decentralized optimization algorithms in practice is the choice of topology. A topology should be sufficiently well connected so that agents reach consensus faster when they communicate. However, more densely connected topologies come at the price of higher bandwidth cost or latency. To address this issue, we study sequences of graphs satisfying the finite-time consensus property (i.e., iterating through such a finite sequence is equivalent to performing global or exact averaging) and their use in the decentralized optimization algorithm Gradient Tracking. We provide an explicit weight matrix representation of the studied sequences and prove their finite-time consensus property. Moreover, we incorporate the studied finite-time consensus topologies into Gradient Tracking and present a new algorithmic scheme called Gradient Tracking for Finite-Time Consensus Topologies (GT-FT). We analyze the new scheme for nonconvex problems with stochastic gradient estimates. Our analysis shows that the convergence rate of GT-FT does not depend on the heterogeneity of the agents' functions or the connectivity of any individual graph in the topology sequence. Furthermore, owing to the sparsity of the graphs, GT-FT requires lower communication costs than Gradient Tracking using the static counterpart of the topology sequence.

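A standard way to see the finite-time consensus property mentioned in the entry above: for n = 2^τ agents, iterating through the sequence of one-peer hypercube graphs yields the exact global average after τ rounds. The NumPy sketch below is a minimal illustration of that property under this power-of-two assumption, not the specific weight-matrix sequences constructed in the thesis.

    import numpy as np

    def hypercube_weights(n, k):
        # Round-k mixing matrix: each agent averages with the single peer whose
        # index differs in bit k. The matrix is symmetric and doubly stochastic.
        W = np.zeros((n, n))
        for i in range(n):
            j = i ^ (1 << k)
            W[i, i] = W[i, j] = 0.5
        return W

    n = 8                                              # must be a power of two here
    x = np.random.default_rng(0).normal(size=(n, 3))   # one local vector per agent
    for k in range(int(np.log2(n))):                   # iterate through the finite graph sequence
        x = hypercube_weights(n, k) @ x
    assert np.allclose(x, x.mean(axis=0))              # every agent now holds the exact average
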
Item Embargo
Unnatural Disasters: Healing Epistemic Invisibility Through Digital Archiving (2024-08-06) Graham, Lindsay Diane; Ostherr, Kirsten; Michie, Helena; Howe, Cymene
Unnatural Disasters: Healing Epistemic Invisibility Through Digital Archiving identifies the emerging genre of the digital disaster archive and argues that this genre exposes the concept of disaster as deeply entangled with and produced by epistemological erasure. Invested in the concept of “disaster” as unnatural and unequivocally social, political, and temporal, I connect a growing awareness of environmental precarity to this new kind of digital memory practice to examine how the archive’s methodology and infrastructure engender a vital politics of accountability. I argue that the digital archive is uniquely suited to address epistemologically produced invisibility by challenging the historical processes and systems that lead to disaster; in so doing, the archive proffers an expanded understanding of health, healing, and care from the margins. This project considers collections from disparate geographies, cultures, and languages that respond to the 2010 Haitian earthquake, the 1989 Exxon Valdez oil spill (digitized in 2010), Japan’s triple disaster (2011), and the SARS-CoV-2 pandemic (2020) to explore how the digital archive is both an open encounter and an exchange of knowledge and power. To specifically reflect on the healing affordances of this increasingly popular form of cultural production, I critically approach the archive through the concepts of digital witnessing, archival methodology as recovery, healing nostalgia, digital self-help, and digitality as care, while grappling with the ways that digital tools can exacerbate or expose the unevenness of vulnerability. By delineating the socio-political processes that lead to disaster and by demonstrating how the digital archive is a potential site of activism that de-naturalizes and thus re-politicizes disaster, Unnatural Disasters charts new paths for critical disaster studies and global digital humanities and ultimately argues for a translational digital humanities approach to global disaster response and humanitarian aid.

Item
Manufacturing Chip-Scale 2D Monolayer Single Crystals and Engineering Quantum Emission in 2D Materials (2024-08-05) Wu, Wenjing; Huang, Shengxi; Kono, Junichiro
Two-dimensional (2D) materials and their van der Waals (vdW) heterostructures continue to reveal unconventional electronic, optical, and magnetic phenomena closely tied to their dimensionality.
In the first part of this thesis, we have demonstrated a facile method for producing uniform, large-area, and crack-free single-crystal transition metal dichalcogenide (TMD) monolayers and artificial structures: wafer-bonder-assisted transfer (WBAT). Compared with single-crystal monolayers produced via traditional Scotch tape exfoliation, the WBAT method can produce flakes that are larger in area by > 10^6 times with almost no cracks. In the second part, we focus on the creation of single photon emitters in WSe2 and WS2 thin flakes, with defect and strain engineering. Our results show a nearly ideal single-photon purity with g^2(0) = 0.03 through effective spectral background suppression.

Item Embargo
Muslim, Sub-Saharan African, and Native American Bodies as European Furnishings, 1500–1700 (2024-08-05) Kim, Dasol; Wolfthal, Diane
My doctoral dissertation investigates the European production and reception of metalwork depicting the Other from the Christian European perspective. I study bronze statuettes, swords, and silver cups made in German-speaking regions and Italy between 1500 and 1700, with a focus on their use in the Holy Roman Empire. These small metal furnishings reflected and constructed the Christian European elites’ conception of depicted groups by being seen, touched, and smelled at courts, city halls, and burgher houses. Chapter one reveals the humanist lens of the Ottoman Empire behind the classical format and placement of bronze Turks in a German residence. Chapter two studies candlesticks and lamps depicting Muslims and Africans grafted onto vegetal and architectural motifs. The moving flame and smoke animated such grotesques that allude to the Otherness. Chapter three discusses how princes and patricians promoted their masculine identities and war propaganda by displaying and wearing swords in the shape of Muslims or Africans. Chapter four studies a rare wager cup incorporating the image of a bearded man wearing a kaftan, a Hungarian or Turkish soldier, into that of a fashionable Italian woman wearing a narrow bodice and a conical skirt. Because its image transgresses gender and ethnicity, the cup offers a nuanced case study of Renaissance Otherness. Chapter five studies wine vessels supported by sculptures of Native American or Afro-American men. Precious metals and edible ingredients like sugar and tropical fruit informed the iconography of these cups. Portable and in proximity, small utensils shaped Renaissance European understanding of the religious and ethnic Other in their everyday lives. My doctoral dissertation bridges the gap between European decorative arts studies and the study of the image of Otherness in European art by shedding light on understudied small metalwork. I adopt sensory studies, performance studies, gender studies, and the issue of class and labor, methods rarely employed to understand the image of Muslims, Africans, and Native Americans in European decorative arts.

Item Embargo
Upon this Rock I Build my Church: Borgund Stave Church and the Aesthetics of Permanence (2024-06-25) Westich, Stephen; Neagley, Linda
Beginning in the twelfth century, the ‘stave’ building technique became popular for parish churches in Norway and possibly throughout northern Europe. By the nineteenth century, these churches could only be found in Norway. Designated important monuments of national identity, they were grouped together based on this technique and are now known as the Norwegian stave churches.
The best-preserved stave church remaining in situ is Borgund stave church. Borgund has been held in high esteem for the remarkable condition of its timber material and has served as a model for restorers in the nineteenth century as well as an icon for Norwegian culture throughout the world. Despite the prevalence of Borgund’s image, this will be the first art historical study to examine it as a functioning building in the twelfth-century context in which it was constructed. This project investigates the spatiality and materiality of Borgund stave church. Drawing on theories of space and materiality, combined with the results of archaeological excavations and conservation work, close readings of texts contemporary to the church, and the building’s form, I argue that its design has as its overriding principle an ‘aesthetics of permanence.’ I define this concept as the intention of developing a building form that not only has the capability of great endurance, but whose endurance is made manifest in the aesthetic design of the building. Through this, the church emphasizes its permanence against the decay and mutability of the natural world, both literally and symbolically, and produces a space for the exercise of power. A narrow passageway circling the building offers a measure of protection to the building, but also highlights the threshold of sacred space. This liminality that emphasizes the borders between sacred and secular space parallels and participates in the broader political context of border creations that produced new margins both legal and theological. The sophisticated carpentry, combined with the masonry at its foundation, the rocks on which the church stands, demonstrates the desire of the builders to create a church that manifested power over nature and an ambition for permanence.

Item Embargo
Harnessing Chirality in Ordered Carbon Nanotube Architectures at Wafer Scale (2024-08-09) Doumani, Jacques; Kono, Junichiro
Harnessing chirality can advance diverse technologies, encompassing displays, quantum light sources, secured communication, and biosensing. This thesis explores harnessing chirality in carbon nanotubes (CNTs) at wafer scales, focusing on molecular intrinsic and structurally engineered chirality. Significant advancements were made in alignment techniques, second harmonic generation (SHG) from aligned and chiral CNT films, and engineering structural chirality in CNTs. We developed techniques to enhance CNT alignment using controlled vacuum filtration, including linear reciprocating shaking, and introduced a novel SEM-based method for characterizing nematic order parameters. The AquaGold process was developed for monolayer precision thinning, achieving a wafer-scale aligned CNT film with a thickness of 2.3 nm and a packing factor of 1000 CNTs/μm. Through SHG measurements, we discovered significant second-order optical nonlinearity in wafer-scale, enantiomer-pure, aligned, and densely packed chiral (6,5)-CNT thin films. The only non-zero element of the second-order nonlinear optical susceptibility tensor reached approximately 1.5 nm/V, the highest value for 1D systems. We also fabricated wafer-scale chiral architectures of ordered CNTs with tunable and large circular dichroism. By controlling the stacking angle and handedness, we achieved a high deep-ultraviolet ellipticity of 40 mdeg/nm.
These findings pave the way for applications in chiral photonics and opto-electronics.

Item
“For Richer, For Poorer, in Sickness and Health”: Gendered Earnings Compositions and Their Effect on the Health of African American Couples (2024-08-08) Parsons, Shahill; Bratter, Jenifer
This study examines how gendered earnings compositions affect the physical health of married African American couples, using data from the National Health Interview Survey (2014-2018). The research employs logistic regression models to evaluate health disparities within this population, drawing on theories of Hegemonic Masculinity and Hegemonic Femininity. Findings reveal complex relationships between earnings composition and health outcomes. Women in equal-earning marriages showed lower odds of poor physical health, while those with husbands as sole earners had higher odds of fair to poor health. For men, gendered earnings compositions were not significantly associated with physical health in fully adjusted models, suggesting a possible "Hybrid Masculinity" adaptation. Couples with egalitarian earning compositions consistently demonstrated better health outcomes across all models. The results highlight the need for culturally specific frameworks that account for the unique experiences of African American families.

Item Embargo
Effective Techniques for Managing Intermediate-Sized Superpages (2024-08-09) Solomon, Eliot Hutton; Cox, Alan L
Translation lookaside buffers (TLBs) are pieces of hardware that cache the results of expensive address translations, improving the performance of the virtual memory system. Design constraints make it impossible for TLBs to store more than a few thousand entries, so "superpages" allow the operating system to instruct the TLB to cache a larger block of memory using a single entry. For small, frequently used memory objects like files and shared libraries, it can be difficult for the operating system to appropriately trade off the memory fragmentation induced by creating a 2 MB superpage with the performance benefits that doing so provides. Because of this, we investigate emerging hardware support for smaller “intermediate-sized” superpages. The first phase of our work explores PTE Coalescing, a feature of AMD Ryzen processors that transparently forms 16 KB or 32 KB superpages from aligned and contiguous groups of 4 KB base pages. We develop a custom microbenchmark to infer details of PTE Coalescing’s hardware implementation. We then determine that the contiguity generated by the Linux and FreeBSD physical memory allocators is insufficient to enable much coalescing and that reservation-based allocation is a good technique for generating additional contiguity to enhance PTE Coalescing. In the second phase of our work, we introduce the first production system capable of simultaneously managing two superpage sizes for file-backed and anonymous mappings by implementing support in the FreeBSD kernel for non-transparent 64 KB superpages on the ARM architecture using the latter’s Contiguous bit feature. We observe a 13.83% improvement in an exec() microbenchmark, a 6.83% boost in Node.js rendering performance, and an 11.18% speedup in a compilation-centric workload. More aggressive superpage promotion policies can further increase the performance benefits; we can boost the speedup to 15.67% using the right policy for the compilation-heavy workload.

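As a loose illustration of how an application asks the operating system for superpage-backed memory, the Python sketch below creates an anonymous mapping and requests transparent 2 MB huge pages on Linux through madvise(MADV_HUGEPAGE). It shows only the Linux transparent-huge-page interface as an assumed stand-in; it is not the FreeBSD Contiguous-bit or PTE Coalescing mechanisms studied in the thesis.

    import mmap

    length = 4 * 1024 * 1024                   # 4 MiB anonymous mapping
    buf = mmap.mmap(-1, length)                # fileno of -1 means anonymous memory
    if hasattr(mmap, "MADV_HUGEPAGE"):         # constant exists only on Linux builds
        buf.madvise(mmap.MADV_HUGEPAGE)        # hint: back this region with huge pages
    buf[:4] = b"test"                          # write so the first page is actually faulted in
    buf.close()
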
Item Embargo
Formally Verified Algorithms for Temporal Logic and Regular Expressions (2024-08-09) Chattopadhyay, Agnishom; Mamouras, Konstantinos
The behavior of systems in various domains including IoT networks, cyber-physical systems, and runtime environments of programs can be observed in the form of linear traces. Temporal logic and regular expressions are two core formalisms used to specify properties of such data. This thesis extends these formalisms to enable the expression of richer classes of properties in a succinct manner, together with algorithms that can handle them efficiently. Using the Coq proof assistant, we formalize the semantics of our specification languages and verify the correctness of our algorithms using mechanically checked proofs. The verified algorithms have been extracted to executable code, and our empirical evaluation shows that they are competitive with state-of-the-art tools. The first part of the thesis is focused on investigating the formalization of an online monitoring framework for past-time metric temporal logic (MTL). We employ an algebraic quantitative semantics that encompasses the Boolean and robustness semantics of MTL, and we interpret formulas over a discrete temporal domain. A potentially infinite-state variant of Mealy machines, a kind of string transducer, is used as a formal model of online monitors. We demonstrate a compositional construction from formulas to monitors, such that each monitor computes (in an online fashion) the semantic values of the corresponding formula over the input stream. The time taken by the monitor to process each input item is O(|φ|), where |φ| is the size of the formula, and is independent of the constants that appear in the formula. The monitor uses O(m) space, where m is the sum of the numerical constants that appear in the formula. The latter part of the thesis is focused on regular expressions. Regular expressions in practice often contain lookaround assertions, which can be used to refine matches based on the surrounding context. Our formal semantics of lookarounds complements the commonly used operational understanding of lookaround in terms of a backtracking implementation. Widely used regular expression matching engines take exponential time to match regular expressions with lookarounds in the worst case. Our algorithm has a worst-case time complexity of O(m · n), where m is the size of the regex and n is the size of the input string. The key insight is to evaluate the lookarounds in a bottom-up manner and guard automaton transitions with oracle queries evaluating the lookarounds. We demonstrate how this algorithm can be implemented in a purely functional manner using marked regular expressions. The formal semantics of lookarounds and our matching algorithm are verified in Coq. Finally, we investigate the formalization of a tokenization algorithm. Tokenization is the process of breaking a monolithic string into a stream of tokens. This is one of the very first steps in the compilation of programs. In this setting, the set of possible tokens is often described using an ordered list of regular expressions. Our algorithm is based on the simulation of the Thompson NFA of the given regular expressions. Two significant parts of the verification effort involve demonstrating the correctness of Thompson's algorithm and the computation of ε-closures using depth-first search. For a stream of length n and a list of regular expressions of total size m, our algorithm finds the first token in O(m · n) time and tokenizes the entire stream in O(m · n^2) time in the worst case.

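The online-monitoring construction sketched in the entry above can be illustrated with the textbook Boolean recurrences for past-time operators: each operator keeps a small piece of state and updates it once per input item. The Python sketch below covers an unbounded "since" and a bounded "once" under plain Boolean semantics; it is a minimal illustration, not the Coq-verified quantitative monitors of the thesis.

    class Since:
        """Online monitor for (p S q): q held at some past step and p has held at every step after it."""
        def __init__(self):
            self.state = False
        def step(self, p, q):
            self.state = q or (p and self.state)
            return self.state

    class OnceWithin:
        """Online monitor for 'q held within the last b steps, counting the current step as 0'."""
        def __init__(self, b):
            self.b = b
            self.age = None          # steps elapsed since q last held; None means never
        def step(self, q):
            if q:
                self.age = 0
            elif self.age is not None:
                self.age += 1
            return self.age is not None and self.age <= self.b
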
Item
Coupled Flow and Transport in an Organ and its Vasculature (2024-08-08) Tzolova, Bilyana; Riviere, Beatrice; Fuentes, David
In contrast to many other types of cancer, the incidence of liver cancer, specifically hepatocellular carcinoma (HCC), is on the rise. For most patients, surgical intervention is not a viable option, leaving them reliant on chemotherapy treatments, particularly transarterial chemoembolization (TACE), for relief. Our study aims to understand how these treatments function within the liver and their impact on tumor growth. Building upon existing research, we model the flow and transport of chemotherapy drugs and embolic agents in the liver using the miscible displacement equations. Utilizing CT images from liver cancer patients, we extract a 1D centerline of the hepatic vascular structures that deliver blood to the tumors, and then construct a 3D mesh from the liver segmentations. We employ the singularity subtraction technique to create a finite element model for the flow of blood in the liver, specifically focusing on areas affected by the TACE treatment. We extend the singularity subtraction technique to the time-dependent advection-diffusion equation to model the concentration of chemotherapy drugs in the liver and tumors. We first solve the time-dependent non-conservative advection-diffusion equation using the finite element method. To address instabilities arising when the model is advection dominated, we then utilize the discontinuous Galerkin method to solve the time-dependent conservative advection-diffusion equation. We couple the models for blood flow following the injection of an embolic agent with the transport of chemotherapy to develop a comprehensive model based on the miscible displacement equations in the liver. We then apply the simulation to data from MD Anderson patients diagnosed with hepatocellular carcinoma who have undergone transarterial chemoembolization treatment. This final model enables us to provide insights into the evolving dynamics of TACE within the liver.

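For orientation on the transport model in the entry above, a one-dimensional advection-diffusion equation c_t + a c_x = D c_xx can be time-stepped with a simple explicit upwind scheme, as in the toy NumPy sketch below. The velocity a, diffusivity D, grid, and initial pulse are illustrative assumptions; this finite-difference cartoon is not the finite element or discontinuous Galerkin solvers developed in the thesis.

    import numpy as np

    nx = 200
    dx = 1.0 / nx
    a, D = 1.0, 1e-3                                   # advection speed and diffusivity (assumed)
    dt = 0.4 * min(dx / a, dx**2 / (2 * D))            # respect both explicit stability limits
    x = np.arange(nx) * dx
    c = np.exp(-((x - 0.2) ** 2) / 0.002)              # initial concentration pulse

    for _ in range(500):
        adv = -a * (c - np.roll(c, 1)) / dx            # first-order upwind (a > 0), periodic domain
        dif = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
        c = c + dt * (adv + dif)                       # explicit Euler update
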
Item Embargo
The Development of Iron Photoredox Strategies in Organic Transformations for the Construction of C–N and C–C Bonds (2024-08-09) Kao, Shih-Chieh; West, Julian G.
Organic transformations via photoredox strategies have emerged as a powerful tool for the synthesis of valuable chemical products; however, prevailing strategies have relied heavily on precious transition-metal catalysts such as iridium, ruthenium, or silver, impeding their general application. The motivation of this work was to develop new catalytic systems using cheap and sustainable earth-abundant metals that allow efficient synthesis of otherwise challenging molecules via photoredox mechanisms. Vicinal diamines are prevalent in bioactive molecules, pharmaceuticals, and molecular catalysts, underscoring their significance. Olefin diazidation emerges as a promising strategy for synthesizing these motifs due to the ability to reduce azides to amines and the availability of structurally diverse olefins. Traditional diazidation methods often involve harsh conditions and have limited substrate scopes. Recent advancements using azidobenziodoxolone (ABX, Zhdankin reagent) or electrochemical methods overcome some of these challenges; however, they introduce different limitations such as high costs and procedural complexity. In chapter 1, we introduce an innovative photochemical diazidation approach utilizing cheap iron salts, which leverages visible light-induced homolysis (VLIH) and radical ligand transfer (RLT). This methodology, which eschews the need for additional oxidants or complex reaction apparatus, offers sustainability and economic benefits while demonstrating compatibility with continuous flow chemistry, providing an efficient and practical route for the synthesis of organic diazides. Preliminary mechanistic studies support the radical nature of the cooperative process in photochemical diazidation, demonstrating this approach as a highly effective method for olefin difunctionalization. Decarboxylative functionalization is a potent strategy for synthesizing diverse products, with ligand-to-metal charge transfer (LMCT) involving earth-abundant 3d metals emerging as a prominent method for reaction design. While recent advancements in copper-mediated decarboxylative C–N bond formation via an LMCT/radical polar crossover (RPC) mechanism have been demonstrated, they face limitations in catalytic function and substrate scope with unactivated alkyl carboxylic acids, challenging their general applicability. In chapter 2, we present a novel photochemical, nucleophilic decarboxylative azidation using iron-catalyzed visible light-induced homolysis (VLIH) and radical ligand transfer (RLT). Our proposed iron-catalyzed approach leverages inexpensive iron nitrate and simple azide sources to convert a variety of carboxylic acids into organic azides under mild conditions. This method avoids the need for external oxidants, complex ligands, or pre-activation of carboxylic acids. Mechanistic studies suggest a radical pathway with nitrate acting as an internal oxidant, offering a “redox-neutral” transformation. This new methodology provides a straightforward and efficient route for synthesizing aliphatic azides, expanding the toolkit for C–N bond formation in pharmaceutical and organic synthesis. Hydroalkylation, the addition of a carbon fragment and hydrogen across an alkene, is an optimal method for forming C(sp3)–C(sp3) bonds from readily available starting materials. Despite notable advancements in branch-selective hydroalkylation via transition metal catalysis, a general strategy for linear-selective hydroalkylation remains underdeveloped, with certain reactions, such as hydroethylation, remaining largely elusive. In chapter 3, we demonstrate that these challenging reactions can be achieved catalytically through traceless radical polarity reversal (TRPR), which employs a removable electron-withdrawing group to facilitate radical alkene addition, followed by in situ removal under reaction conditions. This approach facilitates a variety of previously unattainable hydroalkylation reactions, such as hydromethylation, hydroethylation, and hydrocyclobutylation, all with high linear selectivity using simple malonic acids as alkyl donors. Additionally, it allows for the rapid and efficient synthesis of bioactive molecules, exemplified by a GPR119 agonist, which was produced in good yield and efficiency.
To further showcase the practical utility of our approach, we demonstrated that mono- or difluoromalonic acids can act as novel mono- or difluoromethylene linchpins for accessing gem-mono- or gem-difluoroalkyl skeletons from abundant feedstock chemicals. Importantly, our unified iron/thiol dual catalytic system manages both the alkene addition and the removal of the polarity reversal group, offering a sustainable and straightforward method for these transformations. Preliminary mechanistic studies suggest a dual-catalytic radical mechanism involving decarboxylation via iron-mediated visible light-induced homolysis (VLIH) and hydrogen atom transfer steps. Overall, traceless radical polarity reversal provides a versatile solution for alkene hydroalkylation in both simple and complex settings. In general, these new iron photoredox strategies have shown great advantages and improved sustainability compared to previous work using noble and expensive transition metals and/or harsh conditions. In addition, these strategies allowed us to study the unprecedented catalytic properties of earth-abundant elements and synthesize a wide variety of valuable and traditionally inaccessible molecules, advancing fundamental knowledge in the fields of homogeneous catalysis and synthesis.

Item
Towards Scalable and Robust Integrated Task and Motion Planning in the Real World (2024-08-08) Pan, Tianyang; Kavraki, Lydia E.
Advanced robots are expected to be used in increasingly complex and unstructured settings in the future. Robots can now be deployed in factories to repeatedly execute human-designed routines with high robustness. However, to accomplish complex tasks in unstructured settings, the robots must have the capability of reasoning over the task. Task and Motion Planning (TAMP) is a class of methods that combines high-level task planning and low-level motion planning, enabling a robot to reason over both what steps must be taken to finish a task and how to actually do it. Traditional TAMP literature poses strong assumptions on the class of problems where it is applicable. For example, it is typically assumed that the robot has perfect sensing and execution capabilities, and thus it suffices to find a sequence of motions to finish a task with a real robot. Moreover, many existing TAMP methods focus on single-robot cases and implicitly assume they can scale to multi-robot systems. Such assumptions usually do not hold true when more complex tasks need to be solved (e.g., when execution uncertainty cannot be ignored, or when the task requires coordinating dozens of robots). This work focuses on relaxing such assumptions and proposes novel formulations and frameworks to address a richer set of problems. We combine the typical TAMP paradigm with statistical models such as Bayesian updates to efficiently reason over the robustness of robotic executions. Beyond execution uncertainties, we also extend our work to consider a more general class of problems where the robot has various types of knowledge gaps about the world, including object occlusions, unknown objects, etc. To address such challenges, we combine typical TAMP methods with provided execution-level modules, called behaviors, to enable a novel general framework that can discover geometric constraints during planning time instead of real-robot execution time, to finish real-world tasks more efficiently.
We also extend a typical TAMP solver to multi-robot problem settings, where we introduce an additional intermediate layer to reason over specific variables, largely increasing the planning efficiency. Lastly, we propose two novel execution frameworks for multi-mobile-robot navigation tasks, combining feedback controller design with sampling-based motion planners and multi-agent path-finding algorithms, to solve such tasks under unknown uncertainties in the high-order dynamic model. Such frameworks are proven to be applicable as execution-level modules to general TAMP pipelines.

Item
Predicting Liver Segmentation Model Failure with Feature-Based Out-of-Distribution Detection and Generative Adversarial Networks (2024-08-07) Woodland, McKell; Patel, Ankit B; Jermaine, Christopher M; Brock, Kristy K
Advanced liver cancer is often treated with radiotherapy, which requires precise liver segmentation. Deep learning models excel at segmentation but struggle on unseen data, a problem exacerbated by the difficulty of amassing large datasets in medical imaging. Clinicians manually correct these errors, but as models improve, the risk of clinicians overlooking mistakes due to automation bias increases. To ensure quality care for all patients, this thesis aims to offer automated, scalable, and interpretable solutions for detecting liver segmentation model failures. My first approach prioritized performance and scalability. It applied the Mahalanobis distance (MD) to the features of four Swin UNETR and nnU-net liver segmentation models. I proposed reducing the dimensionality of these features with either principal component analysis (PCA) or uniform manifold approximation and projection (UMAP), resulting in improved performance and efficiency. Additionally, I proposed a k-th nearest neighbors distance (KNN) as a non-parametric alternative to the MD for medical imaging. KNN drastically improved scalability and performance on raw and average-pooled bottleneck features. My second approach emphasized interpretability by introducing generative modeling for the localization of novel information that a model will fail on. It employed a StyleGAN2 network to model a distribution of 3,234 abdominal computed tomography exams (CTs). It then localized metal artifacts and abnormal fluid buildup, two prevalent causes of liver segmentation model failure, in 55 CTs by reconstructing the scans with backpropagation on the StyleGAN’s input space and focusing on the regions with the highest reconstruction errors. The computational cost, data requirements, and training complexity of generative adversarial networks, along with a lack of reliable evaluation measures, have impeded their application to medical imaging. Accordingly, a significant portion of this thesis is dedicated to evaluating the applications of StyleGAN2 and the Fréchet Inception Distance (FID), a common measure of synthetic image quality, to medical imaging. The principal contributions of this thesis are integrating PCA and UMAP with MD, utilizing KNN for out-of-distribution detection in medical imaging, leveraging generative modeling to localize novel information at inference, providing a comprehensive application study of StyleGAN2 to medical imaging, and challenging prevailing assumptions about the FID in medical imaging.

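The feature-based scoring described in the entry above follows a common recipe: reduce the feature dimensionality, then score each test case by its distance to the k-th nearest in-distribution training feature and flag scores above a threshold chosen on the training set. The scikit-learn sketch below uses random arrays as stand-ins for a model's bottleneck features; the choices of k, PCA dimension, and threshold percentile are illustrative assumptions, not the settings used in the thesis.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)
    train_feats = rng.normal(size=(500, 256))                     # stand-in for in-distribution features
    test_feats = np.vstack([rng.normal(size=(20, 256)),           # in-distribution test cases
                            rng.normal(loc=4.0, size=(5, 256))])  # shifted cases simulating OOD

    pca = PCA(n_components=32).fit(train_feats)                   # dimensionality reduction
    z_train, z_test = pca.transform(train_feats), pca.transform(test_feats)

    k = 10
    nn = NearestNeighbors(n_neighbors=k).fit(z_train)
    train_scores = nn.kneighbors(z_train)[0][:, -1]               # distance to k-th neighbour (self included)
    test_scores = nn.kneighbors(z_test)[0][:, -1]

    threshold = np.percentile(train_scores, 95)                   # flag the most atypical scores as suspicious
    is_ood = test_scores > threshold
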
Item Embargo
Graph-based Learning for Efficient Resource Allocation in Wireless Networks under Constraints (2024-08-08) Chowdhury, Arindam; Segarra, Santiago
Optimal allocation of resources, such as power and bandwidth, is essential for increasing spectral efficiency and improving effective network capacity to meet the high quality-of-service (QoS) requirements of modern wireless systems. This is especially challenging under randomly varying channel characteristics and user demands. In particular, power allocation in a wireless network is crucial to mitigate multi-user interference, one of the main performance-limiting factors. The task of interference management is framed as a utility maximization problem under instantaneous and/or time-varying power constraints. Such formulations are NP-hard, and the existing solutions are expensive yet sub-optimal at best. Recently, deep learning algorithms have been extensively employed to obtain approximate solvers efficiently. In particular, graph-based models have been shown to be most effective in leveraging the irregular connectivity structure of wireless networks. In this thesis, we focus on developing near-optimal, generalizable, lightweight, and robust Graph Neural Network (GNN)-based algorithms for effectively solving NP-hard optimization problems in wireless systems under instantaneous and time-varying constraints. The first part of this work specializes in designing domain-informed graph-ML algorithms by leveraging the paradigm of algorithm unfolding for fast and efficient instantaneous power allocation in SISO wireless ad hoc networks (WANET) with theoretically guaranteed convergence and robustness. The next part involves extending the unfolded solution and the theoretical analyses to address the optimal beamforming problem in MISO and MIMO interference networks under max-power constraints. The final part leverages constrained reinforcement learning algorithms for episodic sum-rate and harmonic-fairness maximization under time-varying battery constraints and channel conditions in mobile WANETs (MANET). Through these bodies of work, this thesis develops a unified framework for power allocation under time-coupled physical and utility constraints in wireless networks. Through simulation experiments, we demonstrate a consistent performance improvement over SOTA models, both in terms of system utility and inference time. We also establish the generalization performance of the proposed models across multiple network topologies, sizes, fading conditions, and battery states. Further, we show that the proposed architectures are computationally efficient and can be executed with minimal hardware requirements. The hybrid structure of the models enhances interpretability as well as acts as a fail-safe in case the learnable components are no longer effective.
Finally, the proposed framework is flexible and can be seamlessly applied to multiple tasks, including resource allocation, security, and control in wireless networks and systems.

Item
Towards Robust Planning for High-DoF Robots in Human Environments: The Role of Optimization (2024-08-09) Quintero Pena, Carlos; Kavraki, Lydia E; Kyrillidis, Anastasios
Robot motion planning has been a key component in the race to achieve true robot autonomy. It encompasses methods to generate robot motion that meets kinematic constraints and robot dynamics and that is safe (avoids colliding with the environment). It has been particularly successful in efficiently finding motions for high degree-of-freedom robots such as manipulators, but despite tremendous advances, motion planning methods are not ready for human environments. The uncertainty, diversity, and clutter of the human world challenge the assumptions of motion planning methods, breaking their guarantees, rendering them useless or dramatically worsening their performance. In this thesis, we propose methods to address three important challenges in augmenting motion planning and long-horizon manipulation for human environments. First, we present a framework that enables human-guided motion planning and demonstrate how it can be used for safe planning in partially-observed environments. Second, we present two methods for safe motion planning in the presence of sensing uncertainty, one that requires the poses of segmented objects and another that acts directly on distance information from a noisy sensor. Finally, we present a framework that dramatically improves the performance of long-horizon manipulation tasks in the presence of clutter for an important class of manipulation problems. All of our contributions have mathematical optimization as a connecting thread, used to synthesize high-dimensional trajectories from low-dimensional information or as a layer between high-level and low-level planners. Our results demonstrate how these formulations can be effectively used to augment motion planning and planning for manipulation in novel ways, attaining more robust, efficient, and reliable methods.

Item
Structural Effects in 2D-Stabilized FAPbI3 Films and ToF-SIMS for Ultra-thin h-BN Fabrication (2024-08-06) Torma, Andrew Jonathon; Mohite, Aditya D
The next generation of electronics, photonics, and optoelectronics is based on advancements in semiconductor materials. As such, it is vital to gain understanding of, and cultivate solutions for, degradation pathways and to engineer effective synthesis methods. This work details two thrusts viewed through the lens of stability and fabrication: first, structural changes in a novel method to stabilize halide perovskites, and second, a method for forming ultra-thin van der Waals materials. Thrust I: The halide perovskite formamidinium lead iodide (FAPbI3) is a prime candidate for photovoltaics due to its excellent optoelectronic properties, but its application has been limited due to its structural instability. The large size of the FA cation results in metastability of the photoactive cubic phase and a facile degradation into the thermodynamically stable hexagonal phase at room temperature. Recently, the incorporation of 2D Ruddlesden-Popper halide perovskite seeds into a FAPbI3 precursor solution has been shown to template the growth of and stabilize cubic FAPbI3. Here, we investigate the nanoscale structural and optoelectronic mechanisms behind the observed bulk stabilization using synchrotron-based x-ray microscopies.
Nanoprobe x-ray diffraction reveals that 2D-templated FAPbI3 films exhibit an average compressive strain normal to the substrate of -3.4%, two-fold larger than that of MACl-stabilized FAPbI3. Further, this compression creates locally templated regions comprised of tetragonal-phase FAPbI3 distributed non-uniformly throughout the film with fewer crystalline defects than purely cubic regions. Scanning x-ray excited optical luminescence (the x-ray analogue of photoluminescence) reveals that this local templating results in increased radiative recombination and redshifted emission. Our results help better understand the structural phenomena resulting from stabilization methods in FAPbI3 for engineering durable photovoltaics. Thrust II: In recent years, hexagonal boron nitride (h-BN) has become a promising candidate for next-generation electronics and photonics, such as a gate dielectric in field effect transistors. However, methods for fabrication of ultra-thin materials often lack spatial control or require harsh environment depositions. Here, we report a method to prepare ultra-thin h-BN using the combination of micromechanical cleaving (i.e., the Scotch Tape Method) and ion beam etching through time-of-flight secondary ion mass spectrometry (ToF-SIMS). ToF-SIMS is further employed for 3D reconstruction of h-BN flakes.

Item Embargo
A maverick in the pectin methylesterase family: PME31 acts in lipid droplet utilization (2024-08-06) Hamade, Sarah; Bartel, Bonnie; Braam, Janet
In plants, the primary form of energy stored in seed lipid droplets, triacylglycerol (TAG), is catabolized during germination to support pre-photosynthetic growth. While this process is essential for seedling development, it is incompletely understood. In a screen for Arabidopsis thaliana mutants with delayed lipid droplet coat protein degradation, five independent mutations in PECTIN METHYLESTERASE31 (PME31) were recovered. In addition to delayed coat protein degradation, pme31 mutant seedlings exhibited sustained lipid droplets and elevated levels of several TAG and diacylglycerol species. Although structural prediction classified PME31 as a pectinesterase, it also resembled the putative E. coli lipase, YbhC. A fluorescent PME31 reporter was cytosolic and associated with peroxisomes, the site of fatty acid catabolism, during lipid mobilization. These findings suggest that, in contrast to most PMEs, which modify cell wall pectin, PME31 functions in lipid mobilization at the peroxisome.

Item
The Sociopolitical Implications of Blacks' Belief in the Significance of Systemic Racism (2024-08-08) Gorman, Quintin; Brown, Tony N
This dissertation project investigates the sociopolitical implications of racial capital, defined as Blacks’ belief in the significance of systemic racism. Prior racial attitude studies investigate Blacks’ tendency to endorse systemic (i.e., structural) versus individual (i.e., in-born ability, cultural, or motivational) explanations for racial inequality in U.S. society. Historically, Blacks overwhelmingly endorse systemic explanations for racial inequality. Yet, recent studies show increasing trends wherein Blacks endorse individual explanations for racial inequality. These recent findings expose heterogeneity in Black political thought. However, prior studies neglect the full implications of Blacks endorsing systemic explanations for racial inequality.
To address this gap, this dissertation project analyzes a nationally representative sample of Black adults completing the Outlook on Life Surveys, 2012, to examine relationships between racial capital and perceptions of racial progress, political activities, and social capital. I address several questions: (1) Does racial capital associate with the perception that Obama’s 2008 presidential election showed Blacks now enjoy racial equality? (2) Does racial capital associate positively with political activities? (3) Does racial capital associate positively with social capital? There are three broad takeaways from this dissertation project. First, Blacks gain capital from believing in the significance of systemic racism. It might be a new type of bonding capital. Second, there are disparities in capital, variously defined, between high-SES and low-SES Blacks. For example, racial capital’s benefits extend disproportionately to high-SES Blacks. Stated differently, low-SES and dispossessed Blacks do not reap as much capital from believing in the significance of systemic racism. Third, racial capital merits further investigation. Not counting this dissertation project, few studies investigate racial capital. Capital gained from belief in the significance of systemic racism may extend beyond political activities and social capital. For example, racial capital may be consequential for Blacks’ mental health, psychological resources, physical health, and more.

Item
Estimation of Gaussian Graphical Models Using Learned Graph Priors (2024-08-06) Sevilla, Martin; Segarra, Santiago
We propose a novel algorithm for estimating Gaussian graphical models incorporating prior information about the underlying graph. Classical approaches generally propose optimization problems with sparsity penalties as prior information. While efficient, these approaches do not allow using involved prior distributions and force us to incorporate the prior information on the precision matrix rather than on its support. In this work, we investigate how to estimate the graph of a Gaussian graphical model by introducing any prior distribution directly on the graph structure. We use graph neural networks to learn the score function of any graph prior and then leverage Langevin diffusion to generate samples from the posterior distribution. We study the estimation of both partially known and entirely unknown graphical models and prove that our proposed estimator is consistent in both scenarios. Finally, numerical experiments using synthetic and real-world graphs demonstrate the benefits of our approach.

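The sampling step behind the method in the entry above has a simple generic form: starting from an initial point, repeatedly apply x <- x + (step/2) * score(x) + sqrt(step) * noise (the unadjusted Langevin algorithm). The sketch below implements that update with a closed-form standard Gaussian score as a stand-in; in the paper's method the score of the posterior over graphs would instead come from the trained graph neural network, so the score function and step size here are illustrative assumptions only.

    import numpy as np

    def langevin_sample(score, x0, step=1e-2, n_steps=1000, rng=None):
        # Unadjusted Langevin algorithm: x <- x + (step/2) * score(x) + sqrt(step) * noise.
        if rng is None:
            rng = np.random.default_rng(0)
        x = np.array(x0, dtype=float)
        for _ in range(n_steps):
            x = x + 0.5 * step * score(x) + np.sqrt(step) * rng.normal(size=x.shape)
        return x

    # Toy stand-in: the score of a standard Gaussian is -x; a learned score network
    # would take its place when sampling graph structures from the posterior.
    sample = langevin_sample(lambda x: -x, x0=np.zeros(5))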