Singular Value Perturbation and Deep Network Optimization

dc.citation.journalTitle: Constructive Approximation
dc.contributor.author: Riedi, Rudolf H.
dc.contributor.author: Balestriero, Randall
dc.contributor.author: Baraniuk, Richard G.
dc.date.accessioned: 2022-12-13T19:11:30Z
dc.date.available: 2022-12-13T19:11:30Z
dc.date.issued: 2022
dc.description.abstract: We develop new theoretical results on matrix perturbation to shed light on the impact of architecture on the performance of a deep network. In particular, we explain analytically what deep learning practitioners have long observed empirically: the parameters of some deep architectures (e.g., residual networks (ResNets) and dense networks (DenseNets)) are easier to optimize than others (e.g., convolutional networks (ConvNets)). Building on our earlier work connecting deep networks with continuous piecewise-affine splines, we develop an exact local linear representation of a deep network layer for a family of modern deep networks that includes ConvNets at one end of a spectrum and ResNets, DenseNets, and other networks with skip connections at the other. For regression and classification tasks that optimize the squared-error loss, we show that the optimization loss surface of a modern deep network is piecewise quadratic in the parameters, with local shape governed by the singular values of a matrix that is a function of the local linear representation. We develop new perturbation results for how the singular values of matrices of this sort behave as we add a fraction of the identity and multiply by certain diagonal matrices. A direct application of our perturbation results explains analytically why a network with skip connections (such as a ResNet or DenseNet) is easier to optimize than a ConvNet: thanks to its more stable singular values and smaller condition number, the local loss surface of such a network is less erratic, less eccentric, and features local minima that are more accommodating to gradient-based optimization. Our results also shed new light on the impact of different nonlinear activation functions on a deep network’s singular values, regardless of its architecture.
dc.identifier.citation: Riedi, Rudolf H., Balestriero, Randall, and Baraniuk, Richard G. "Singular Value Perturbation and Deep Network Optimization." Constructive Approximation (2022). Springer Nature. https://doi.org/10.1007/s00365-022-09601-5.
dc.identifier.digital: s00365-022-09601-5
dc.identifier.doi: https://doi.org/10.1007/s00365-022-09601-5
dc.identifier.uri: https://hdl.handle.net/1911/114113
dc.language.iso: eng
dc.publisher: Springer Nature
dc.rights: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.title: Singular Value Perturbation and Deep Network Optimization
dc.type: Journal article
dc.type.dcmi: Text
dc.type.publication: publisher version
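
Illustration of the abstract's central claim. The abstract above argues that adding the identity to a layer's matrix (as a skip connection does) stabilizes its singular values and shrinks its condition number, making the local loss surface friendlier to gradient descent. The following minimal NumPy sketch is our illustration, not code from the paper: the matrix W, its size n, and the Gaussian initialization are assumptions standing in for the paper's local linear representation of a layer.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 256

    # Random layer matrix with a common 1/sqrt(n) initialization scale.
    # This stands in for the local linear representation of one layer.
    W = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, n))

    def report(name, A):
        # Singular values, returned in descending order.
        s = np.linalg.svd(A, compute_uv=False)
        print(f"{name}: sigma_max={s[0]:.3f}  sigma_min={s[-1]:.2e}  "
              f"cond={s[0] / s[-1]:.1f}")

    report("ConvNet-like W      ", W)               # no skip connection
    report("ResNet-like  I + W  ", np.eye(n) + W)   # skip connection adds the identity

On typical runs, the smallest singular value of W sits near zero, so its condition number is large, while I + W keeps all singular values near one and its condition number small. This is the qualitative behavior that the paper's perturbation results make precise.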
Files
Original bundle
Name: s00365-022-09601-5.pdf
Size: 1.61 MB
Format: Adobe Portable Document Format