Dec 6, 2021
Flatness of the loss curve is conjectured to be connected to the generalization ability of machine learning models, in particular neural networks. Indeed, it has been empirically observed that flatness measures consistently correlate strongly with generalization. However, it is an open theoretical problem why and under which circumstances flatness is connected to generalization, in particular in light of reparameterizations that change certain flatness measures but leave generalization unchanged. This paper investigates this connection by relating it to interpolation from representative data, deriving notions of representativeness and feature robustness. These notions allow us to rigorously connect flatness and generalization and to identify conditions under which the connection holds. Moreover, they give rise to a novel, natural relative flatness measure that correlates strongly with generalization, simplifies to ridge regression for ordinary least squares, and resolves the reparameterization issue.
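To illustrate the reparameterization issue the abstract mentions, here is a minimal sketch (not the paper's exact definition of relative flatness): for a linear model with mean-squared-error loss, the Hessian trace is a plain flatness measure, but it changes under the rescaling `w -> a*w`, `X -> X/a` that leaves predictions unchanged. Multiplying the trace by the squared weight norm yields a relative quantity that is invariant under this rescaling. All variable names and the specific construction here are illustrative assumptions.

```python
import numpy as np

def relative_flatness(X, w):
    """Illustrative weight-rescaling-invariant flatness for a linear model.

    The Hessian of the MSE loss (1/n)||Xw - y||^2 with respect to w is
    (2/n) X^T X, independent of w; its trace is a plain flatness measure.
    """
    n = X.shape[0]
    hessian_trace = (2.0 / n) * np.sum(X * X)  # trace of (2/n) X^T X
    # Scaling by ||w||^2 cancels the 1/a^2 factor the trace picks up
    # under X -> X/a, w -> a*w, making the measure reparameterization-invariant.
    return float(w @ w * hessian_trace)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
w = rng.normal(size=5)

a = 3.0
base = relative_flatness(X, w)
rescaled = relative_flatness(X / a, a * w)
assert np.isclose(base, rescaled)  # invariant under the rescaling
```

The squared-weight-norm factor also hints at why such a measure can reduce to a ridge-style penalty in the ordinary-least-squares setting, as the abstract states.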
Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.