Dec 10, 2019
- On Exact Computation with an Infinitely Wide Neural Net
- Generalization Bounds of Stochastic Gradient Descent for Wide and Deep Neural Networks
- Efficient and Accurate Estimation of Lipschitz Constants for Deep Neural Networks
- Towards Explaining the Regularization Effect of Initial Large Learning Rate in Training Neural Networks
Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia, and oral and poster presentations of refereed papers. Workshops held after the main conference provide a less formal setting.