Alemohammad, Sina. "The Recurrent Neural Tangent Kernel." (2022) Master's Thesis, Rice University. https://hdl.handle.net/1911/113528

Advisor: Baraniuk, Richard G.
Date: May 2022
Keywords: Neural Tangent Kernel; Recurrent Neural Networks
Format: application/pdf
Language: English
Rights: Copyright is held by the author, unless otherwise indicated. Permission to reuse, publish, or reproduce the work beyond the bounds of fair use or other exemptions to copyright law must be obtained from the copyright holder.

Abstract: The study of deep neural networks (DNNs) in the infinite-width limit, via the so-called neural tangent kernel (NTK) approach, has provided new insights into the dynamics of learning, generalization, and the impact of initialization. One key DNN architecture remains to be kernelized: the recurrent neural network (RNN). In this thesis we introduce and study the Recurrent Neural Tangent Kernel (RNTK), which provides new insights into the behavior of overparametrized RNNs. A key property of the RNTK that should greatly benefit practitioners is its ability to compare inputs of different lengths. To this end, we characterize how the RNTK weights different time steps to form its output under different initialization parameters and nonlinearity choices. Experiments on one synthetic and 56 real-world data sets demonstrate that the RNTK offers significant performance gains over other kernels, including standard NTKs.
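To make the tangent-kernel idea behind the abstract concrete, the following is a minimal, illustrative sketch (not the thesis's analytic RNTK): for a finite-width RNN, the empirical tangent kernel between two inputs is the inner product of the network's parameter gradients, K(x, x') = ⟨∂f(x)/∂θ, ∂f(x')/∂θ⟩. The toy RNN, its parameter layout, and the finite-difference gradients below are all hypothetical choices made for this sketch; note that, as the abstract emphasizes, the two inputs may have different numbers of time steps.

```python
import numpy as np

N_IN, N_H = 2, 4  # toy sizes chosen for this sketch

def rnn_output(theta, xs, n_in=N_IN, n_h=N_H):
    """Scalar output of a toy tanh RNN with a flat parameter vector.

    Hypothetical layout of theta: W_in (n_h*n_in), W_h (n_h*n_h), w_out (n_h).
    """
    i = 0
    W_in = theta[i:i + n_h * n_in].reshape(n_h, n_in); i += n_h * n_in
    W_h = theta[i:i + n_h * n_h].reshape(n_h, n_h); i += n_h * n_h
    w_out = theta[i:i + n_h]
    h = np.zeros(n_h)
    for x in xs:                # xs may contain any number of time steps
        h = np.tanh(W_in @ x + W_h @ h)
    return w_out @ h            # read out the final hidden state

def empirical_ntk(theta, xs1, xs2, eps=1e-5):
    """<df/dtheta(xs1), df/dtheta(xs2)> via central finite differences."""
    def grad(xs):
        g = np.zeros_like(theta)
        for j in range(theta.size):
            e = np.zeros_like(theta)
            e[j] = eps
            g[j] = (rnn_output(theta + e, xs) - rnn_output(theta - e, xs)) / (2 * eps)
        return g
    return grad(xs1) @ grad(xs2)

rng = np.random.default_rng(0)
theta = rng.standard_normal(N_H * N_IN + N_H * N_H + N_H) / np.sqrt(N_H)
x_short = rng.standard_normal((3, N_IN))   # 3 time steps
x_long = rng.standard_normal((7, N_IN))    # 7 time steps: different lengths are fine
print(empirical_ntk(theta, x_short, x_long))
```

Because the kernel is a plain inner product of gradients, it is symmetric and positive on the diagonal, and it is defined for any pair of sequence lengths; the infinite-width RNTK studied in the thesis replaces this per-network quantity with a deterministic limit.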