Invited Talk: Understanding the Role of Optimism in Minimax Optimization: A Proximal Point Approach

Dec 14, 2019


About

In this talk, we consider solving saddle point problems and, in particular, discuss the concept of "optimism" or "negative momentum", a technique observed to yield superior empirical performance in training GANs. The goal of this talk is to provide a theoretical understanding of why optimism helps, and in particular why the Optimistic Gradient Descent Ascent (OGDA) algorithm performs well in practice. To do so, we first consider the classical Proximal Point method, an implicit algorithm for solving saddle point problems. We then show that OGDA inherently tries to approximate the Proximal Point update, and that this approximation is the rationale behind the "negative momentum" term in the OGDA update. This proximal point approximation viewpoint also enables a much simpler analysis of another well-studied algorithm, the Extra-Gradient (EG) method.
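The following is a minimal sketch, not from the talk itself, illustrating the OGDA update described above on the toy bilinear saddle point problem f(x, y) = x * y. The step size, iteration count, and objective are illustrative assumptions; on this problem plain gradient descent ascent (GDA) is known to spiral away from the saddle point (0, 0), while OGDA's extra "previous gradient" correction (the negative momentum term) acts as a cheap explicit surrogate for the implicit Proximal Point step and drives the iterates toward (0, 0).

def grad_x(x, y):
    # df/dx for the toy objective f(x, y) = x * y
    return y

def grad_y(x, y):
    # df/dy for the toy objective f(x, y) = x * y
    return x

def ogda(x0, y0, eta=0.1, steps=500):
    """Optimistic Gradient Descent Ascent (sketch).

    The "- eta * previous gradient" correction approximates the gradient at
    the next iterate, which the implicit Proximal Point method would use.
    """
    x, y = x0, y0
    gx_prev, gy_prev = grad_x(x, y), grad_y(x, y)
    for _ in range(steps):
        gx, gy = grad_x(x, y), grad_y(x, y)
        x = x - 2 * eta * gx + eta * gx_prev   # descent step on x
        y = y + 2 * eta * gy - eta * gy_prev   # ascent step on y
        gx_prev, gy_prev = gx, gy
    return x, y

def gda(x0, y0, eta=0.1, steps=500):
    """Plain GDA for comparison; on f(x, y) = x * y it drifts away from (0, 0)."""
    x, y = x0, y0
    for _ in range(steps):
        gx, gy = grad_x(x, y), grad_y(x, y)
        x, y = x - eta * gx, y + eta * gy
    return x, y

if __name__ == "__main__":
    print("OGDA:", ogda(1.0, 1.0))   # moves toward the saddle point (0, 0)
    print("GDA: ", gda(1.0, 1.0))    # spirals away from (0, 0)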


About NIPS 2019

Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.
