Domain decomposition-based reduced-order models using nonlinear-manifolds and interpolatory projections

Date
2024-04-17
Abstract

This thesis integrates nonlinear-manifold and interpolatory projection model reduction with domain decomposition (DD) to reduce the offline costs of training reduced-order models (ROMs). In the neural network-based nonlinear-manifold ROM (NM-ROM) setting, the number of parameters requiring training scales with the full-order model (FOM) size. By applying DD, subdomain NM-ROMs can be trained in parallel with significantly fewer parameters than a global NM-ROM, resulting in less computationally expensive training. The DD NM-ROM approach algebraically decomposes a fully discretized FOM into algebraic subdomains, computes an NM-ROM for each subdomain, and minimizes the residual for each subdomain ROM while coupling the subdomains via compatibility constraints. This thesis begins with the steady-state setting, providing the algebraic DD formulation, developing the subdomain NM-ROM approach, detailing a Lagrange-Gauss-Newton sequential quadratic programming (SQP) solver to evaluate the DD NM-ROM, and providing an a posteriori error analysis. The time-dependent extension is developed analogously to the steady-state case. The DD NM-ROM approach is compared with a closely related DD proper orthogonal decomposition (POD) approach using the 2D Burgers' equation as a benchmark example for both the steady and unsteady cases.
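The coupled subdomain formulation described above can be written schematically as a constrained least-squares problem. The notation here (decoders d_i, subdomain residuals r_i, constraint matrices A_i) is illustrative shorthand, not necessarily the thesis's exact symbols:

```latex
\min_{\hat{x}_1,\dots,\hat{x}_S} \;
  \sum_{i=1}^{S} \tfrac{1}{2}\bigl\| r_i\bigl(d_i(\hat{x}_i)\bigr) \bigr\|_2^2
\quad \text{subject to} \quad
  \sum_{i=1}^{S} A_i \, d_i(\hat{x}_i) = 0 ,
```

where $\hat{x}_i$ are the reduced coordinates of subdomain $i$, $d_i$ maps them to the subdomain's full state (a decoder network in the NM-ROM case), $r_i$ is the subdomain residual, and the linear constraints enforce compatibility of interface values between subdomains. Applying an SQP iteration to the associated Lagrangian with a Gauss-Newton Hessian approximation yields a solver of the Lagrange-Gauss-Newton type mentioned above.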

The DD NM-ROM approach is constructed to approximate the FOM solution on each subdomain. Alternatively, one can apply ROM approaches that approximate the input-to-output map of a FOM, e.g., through interpolatory projection. For interpolatory ROMs, ROM training involves several evaluations of the resolvent of a dynamical system per interpolation point, thus requiring repeated, computationally expensive inversions of very large matrices. Applying DD lets one compute interpolatory ROMs for each subdomain in parallel, which only requires evaluating resolvents at the subdomain level, thus decreasing the ROM training cost. However, in the DD approach, the transmission conditions between subdomains introduce additional input-to-output maps that must be approximated. Additionally, applying an algebraic DD to a quadratic-bilinear (QB) system results in subdomain QB systems with bilinear outputs. This thesis provides a framework for computing interpolatory ROMs in the more general case of QB systems with QB outputs, and provides initial theoretical results for using interpolatory ROMs in the DD context.
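As a concrete illustration of the resolvent evaluations behind interpolatory projection, the sketch below builds a two-sided interpolatory ROM for a plain linear system (not the quadratic-bilinear setting developed in the thesis). All matrices and interpolation points are made-up examples:

```python
import numpy as np

# Toy linear system x' = A x + B u, y = C x, standing in for one (sub)domain FOM.
# All matrices and interpolation points are illustrative, not from the thesis.
rng = np.random.default_rng(0)
n, m, p = 50, 2, 2                                  # state, input, output dims
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))  # comfortably stable
B = rng.standard_normal((n, m))
C = rng.standard_normal((p, n))

def H(s):
    """FOM transfer function H(s) = C (sI - A)^{-1} B: one resolvent solve."""
    return C @ np.linalg.solve(s * np.eye(n) - A, B)

# One large resolvent solve per interpolation point (and per side) -- the
# dominant training cost that the DD approach pushes down to subdomain size.
sigmas = [1.0, 2.0, 4.0]
V = np.hstack([np.linalg.solve(s * np.eye(n) - A, B) for s in sigmas])
W = np.hstack([np.linalg.solve(s * np.eye(n) - A.T, C.T) for s in sigmas])

# Two-sided (Petrov-Galerkin) projection yields a small descriptor ROM.
Er, Ar = W.T @ V, W.T @ A @ V
Br, Cr = W.T @ B, C @ V

def Hr(s):
    """ROM transfer function H_r(s) = C_r (s E_r - A_r)^{-1} B_r."""
    return Cr @ np.linalg.solve(s * Er - Ar, Br)

# The ROM matches the FOM transfer function at every interpolation point.
for s in sigmas:
    assert np.allclose(H(s), Hr(s))
```

Each column block of V and W costs one solve with an n-by-n matrix; in the DD setting these solves are performed per subdomain, which is what reduces the training cost relative to a single global interpolatory ROM.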

Degree
Doctor of Philosophy
Type
Thesis
Keywords
Reduced-order models, Domain decomposition, Nonlinear manifold, Sparse autoencoders, Neural networks, Least squares Petrov-Galerkin, Scientific machine learning