Self-Consuming Generative Models Go MAD

Date
2024-07-30
Abstract

Seismic advances in generative AI algorithms for imagery, text, and other data types have led to the temptation to use synthetic data to train next-generation models. Repeating this process creates an autophagous (self-consuming) loop whose properties are poorly understood. We conduct a thorough analytical and empirical analysis, using state-of-the-art generative image models, of three families of autophagous loops that differ in how fixed or fresh real training data is made available through the generations of training and in whether the samples from previous-generation models have been biased to trade off data quality versus diversity. Our primary conclusion across all scenarios is that without enough fresh real data in each generation of an autophagous loop, future generative models are doomed to have their quality (precision) or diversity (recall) progressively decrease. We term this condition Model Autophagy Disorder (MAD), by analogy to mad cow disease.
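
To make the autophagous-loop idea concrete, here is a minimal toy sketch. It is not the thesis's experimental setup (which uses state-of-the-art generative image models); it replaces the image model with a 1-D Gaussian fit so the effect is visible in a few lines. The function name run_autophagous_loop and the fresh_fraction / truncation parameters are illustrative choices, with truncation standing in for the quality-biased sampling the abstract mentions.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_autophagous_loop(n_generations=30, n_samples=5000,
                         fresh_fraction=0.0, truncation=None):
    """Iteratively refit a 1-D Gaussian model, mostly on its own samples.

    fresh_fraction: share of each generation's training set drawn from the
                    true distribution N(0, 1) (0.0 = fully synthetic loop).
    truncation:     if set, keep only synthetic samples within `truncation`
                    standard deviations of the model mean, i.e. a quality-
                    biased sampler that trades diversity for quality.
    """
    mu, sigma = 0.0, 1.0                       # generation 0 = true distribution
    stds = [sigma]
    for _ in range(n_generations):
        synthetic = rng.normal(mu, sigma, n_samples)
        if truncation is not None:             # biased (quality-favoring) sampling
            synthetic = synthetic[np.abs(synthetic - mu) < truncation * sigma]
        n_fresh = int(fresh_fraction * n_samples)
        fresh_real = rng.normal(0.0, 1.0, n_fresh)
        train = np.concatenate([synthetic, fresh_real])
        mu, sigma = train.mean(), train.std()  # "train" the next generation
        stds.append(sigma)
    return stds

# Fully synthetic loop with biased sampling: the fitted std collapses.
print(f"no fresh data : std = {run_autophagous_loop(truncation=1.5)[-1]:.3f}")
# Injecting fresh real data each generation slows or prevents the collapse.
print(f"50% fresh data: std = {run_autophagous_loop(truncation=1.5, fresh_fraction=0.5)[-1]:.3f}")
```

In this toy version the fitted standard deviation is a crude stand-in for diversity (recall): with no fresh real data and biased sampling it shrinks generation after generation, while mixing in fresh real data each generation keeps it from collapsing, mirroring the abstract's conclusion.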

Degree
Master of Science
Type
Thesis
Keywords
generative models, artificial intelligence, AI, self-consuming, autophagous, self-training, madness, model autophagy disorder, image models
Citation

Casco-Rodriguez, Josue. Self-Consuming Generative Models Go MAD. (2024). Master's thesis, Rice University. https://hdl.handle.net/1911/117804

Rights
Copyright is held by the author, unless otherwise indicated. Permission to reuse, publish, or reproduce the work beyond the bounds of fair use or other exemptions to copyright law must be obtained from the copyright holder.