Advisor: de Hoop, Maarten V.
Dates: 2024-05-20; 2024-05; 2024-01-18; May 2024
Citation: Benitez, Antonio Lara. Out-of-distributional risk bounds for neural operators with applications to the Helmholtz equation. (2024). Masters thesis, Rice University. https://hdl.handle.net/1911/115909
EMBARGO NOTE: This item is embargoed until 2024-11-01

Abstract: Deep learning has emerged as a remarkably successful and versatile approach within machine learning, finding applications across a diverse range of domains. Originally devised for tasks such as classification and natural language processing, it has since made significant inroads into scientific computing. Architectures such as DeepONet and neural operators have shown their potential for approximating operators defined by partial differential equations (PDEs). While these architectures have seen practical success, there remains a compelling need to examine their theoretical foundations. This thesis aims to contribute to the theoretical understanding of deep learning by applying statistical learning theory to the neural operator family. Our primary focus is the generalization properties of this family, with particular attention to the challenges posed by the high-frequency Helmholtz equation. To this end, we propose a subfamily of neural operators, called sequential neural operators, which preserves all the approximation guarantees of neural operators while exhibiting enhanced generalization properties. This design draws inspiration from the self-attention mechanism found in the ubiquitous transformer architecture. To analyze both neural operators and sequential neural operators, we establish upper bounds on their Rademacher complexity; these bounds are instrumental in deriving the corresponding generalization error bounds.
Furthermore, we leverage Gaussian-Banach spaces to shed light on the out-of-distribution risk bounds of traditional neural operators and sequential neural operators.

Format: application/pdf
Language: eng
Rights: Copyright is held by the author, unless otherwise indicated. Permission to reuse, publish, or reproduce the work beyond the bounds of fair use or other exemptions to copyright law must be obtained from the copyright holder.
Keywords: Neural operators; statistical learning theory; deep learning; out-of-distribution; Rademacher complexity; Gaussian-Banach spaces; risk bounds
Title: Out-of-distributional risk bounds for neural operators with applications to the Helmholtz equation
Type: Thesis
Date: 2024-05-20
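As a concrete illustration of the Rademacher-complexity machinery mentioned in the abstract, the sketch below gives a Monte Carlo estimate of the empirical Rademacher complexity of a finite hypothesis class, R̂_S(F) = E_σ[ sup_{f∈F} (1/n) Σ_i σ_i f(x_i) ]. This is a generic textbook quantity, not the thesis's own bound: the function `empirical_rademacher` and its inputs are hypothetical names for illustration, assuming the class is represented by its output values on a fixed sample.

```python
import numpy as np

def empirical_rademacher(outputs, n_trials=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity
    of a finite hypothesis class on a fixed sample S.

    outputs: array of shape (num_hypotheses, n), where row j holds
             f_j(x_1), ..., f_j(x_n).
    """
    rng = np.random.default_rng(seed)
    _, n = outputs.shape
    total = 0.0
    for _ in range(n_trials):
        # Draw i.i.d. Rademacher signs sigma_i in {-1, +1}
        sigma = rng.choice([-1.0, 1.0], size=n)
        # Supremum over the (finite) class of the signed empirical average
        total += np.max(outputs @ sigma) / n
    return total / n_trials
```

For a singleton class containing only the zero function, the estimate is exactly 0; enlarging the class can only increase it, which mirrors the role these complexities play in generalization bounds (richer classes incur larger estimation error).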