Browsing by Author "Baraniuk, Richard G"
Item: Data-Driven Computational Sensing (2018-04-30)
Mousavi, Ali; Baraniuk, Richard G

Great progress has been made in sensing, perception, and signal processing over the last few decades through the design of algorithms matched to the underlying physics and statistics of the task at hand. However, a host of difficult problems remain where the physics-based approach comes up short; for example, unrealistic image models stunt the performance of MRI and other computational imaging systems. Fortunately, the big-data age has enabled new kinds of machine learning algorithms that augment our understanding of the physics with models learned from large amounts of training data. In this thesis, we overview three increasingly integrated physics+data algorithms for solving the kinds of inverse problems encountered in computational sensing. At the lowest level, data can be used to automatically tune the parameters of an optimization algorithm, improving its inferential and computational performance. At the next level, data can be used to learn a more realistic signal model that boosts the performance of an iterative recovery algorithm. At the highest level, data can be used to train a deep network that encapsulates the complete underlying physics of the sensing problem (i.e., not just the signal model but also the forward model that maps signals into measurements). We show that moving up the physics+data hierarchy exploits training data more heavily and boosts performance accordingly.

Item: Overparameterization and double descent in PCA, GANs, and diffusion models (2024-04-19)
Luzi, Lorenzo; Baraniuk, Richard G

This PhD thesis synthesizes my doctoral work on generative modeling, with a particular focus on overparameterization.
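The double descent phenomenon named in the title can be illustrated with a minimal, self-contained experiment (a generic sketch with arbitrary dimensions, not the thesis's actual setup): fitting minimum-norm least squares with an increasing number of features, test error typically peaks near the interpolation threshold and falls again in the overparameterized regime.

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, noise, d_max, trials = 20, 200, 0.5, 60, 200

# Feature counts: underparameterized, at the interpolation
# threshold (p == n_train), and overparameterized.
widths = (5, n_train, d_max)
errors = {p: [] for p in widths}

for _ in range(trials):
    # Ground-truth linear model over all d_max features.
    w = rng.standard_normal(d_max) / np.sqrt(d_max)
    X_tr = rng.standard_normal((n_train, d_max))
    X_te = rng.standard_normal((n_test, d_max))
    y_tr = X_tr @ w + noise * rng.standard_normal(n_train)
    y_te = X_te @ w
    for p in widths:
        # Fit using only the first p features; pinv returns the
        # minimum-norm least-squares solution when p > n_train.
        w_hat = np.linalg.pinv(X_tr[:, :p]) @ y_tr
        errors[p].append(np.mean((X_te[:, :p] @ w_hat - y_te) ** 2))

mean_err = {p: float(np.mean(e)) for p, e in errors.items()}
# Typically the test error spikes at p == n_train and drops
# again for p > n_train: the double descent curve.
```

Averaging over many trials makes the peak at the interpolation threshold stand out clearly; a single draw can be noisy because the square design matrix at p = n_train is often ill-conditioned.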
Using a novel method we call pseudo-supervision, we characterize overparameterization behaviors, including double descent, in GANs and in PCA-like problems. Extending pseudo-supervision to diffusion models, we show that it can be used to create an inductive bias, allowing us to train our model with lower generalization error and faster convergence than the baseline. We additionally introduce a novel method called Boomerang that extends our study of diffusion models, showing that they can be used for local sampling on image manifolds. Finally, in an approach we call WaM, we extend the Fréchet inception distance (FID) to non-Gaussian feature distributions by modeling the features with a Gaussian mixture and using a bound on the 2-Wasserstein metric between Gaussian mixture models.
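For context on what WaM generalizes: standard FID is the squared 2-Wasserstein (Fréchet) distance between two Gaussians fitted to real and generated features, which has a simple closed form. A minimal sketch of that baseline quantity (function name mine; WaM's Gaussian-mixture bound is not shown here):

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_gaussian_distance(mu1, sigma1, mu2, sigma2):
    """Squared 2-Wasserstein (Frechet) distance between two Gaussians,
    the quantity FID evaluates on Inception features:
    ||mu1 - mu2||^2 + Tr(S1 + S2 - 2 (S1 S2)^{1/2})."""
    covmean = sqrtm(sigma1 @ sigma2)
    if np.iscomplexobj(covmean):
        # Discard tiny imaginary parts from numerical error.
        covmean = covmean.real
    diff = mu1 - mu2
    return float(diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean))
```

For identical Gaussians the distance is zero; shifting the mean by a unit vector while keeping identity covariances yields exactly 1. WaM replaces the single Gaussian fit with a Gaussian mixture, at the cost of computing a bound rather than the exact metric.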