Analysis of line search methods with various gradient approximation schemes for noisy DFO

Dec 13, 2019

About

We develop a convergence analysis of a modified line search method for objective functions whose values are computed with noise and whose gradients are not directly available. The noise is assumed to be bounded in absolute value, without any additional assumptions. In this setting, gradient approximations can be constructed via interpolation or via sample average approximation of smoothing gradients, and thus they are always inexact and possibly random. We extend the framework based on stochastic methods, which was developed to analyze a standard line search method with exact function values and random gradients, to the case of noisy function values. We introduce a condition on the gradient estimate which, when satisfied with sufficiently large probability at each iteration, guarantees the convergence properties of the line search method. We derive expected complexity bounds for convex, strongly convex, and nonconvex functions. We motivate these results with several recent papers related to policy optimization.
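To make the ingredients of the abstract concrete (bounded noise, an inexact gradient built by interpolation, and a line search whose acceptance test tolerates the noise), here is a minimal Python sketch. It is not the speakers' algorithm: the forward-difference gradient, the Armijo test relaxed by a 2*eps_f slack, the step-size update, and all names (fd_gradient, noisy_line_search, eps_f) are illustrative assumptions.

```python
import numpy as np

def fd_gradient(f, x, h):
    """Forward-difference gradient estimate of the noisy objective f at x."""
    fx = f(x)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def noisy_line_search(f, x0, eps_f, h=1e-3, alpha0=1.0, theta=0.5,
                      c1=1e-4, max_iter=200):
    """Backtracking line search driven by inexact gradient estimates.

    The Armijo sufficient-decrease test is relaxed by 2*eps_f to account for
    function values that differ from the true objective by at most eps_f.
    """
    x = np.asarray(x0, dtype=float)
    alpha = alpha0
    for _ in range(max_iter):
        g = fd_gradient(f, x, h)      # inexact, possibly random gradient
        d = -g                        # descent direction from the estimate
        if f(x + alpha * d) <= f(x) - c1 * alpha * g.dot(g) + 2.0 * eps_f:
            x = x + alpha * d                   # successful step
            alpha = min(alpha / theta, alpha0)  # cautiously increase step size
        else:
            alpha *= theta                      # unsuccessful step: backtrack
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    eps_f = 1e-3
    # Smooth quadratic plus bounded (uniform) noise of magnitude <= eps_f.
    noisy_quadratic = lambda x: float(x @ x) + eps_f * (2.0 * rng.random() - 1.0)
    print(noisy_line_search(noisy_quadratic, x0=np.ones(5), eps_f=eps_f))
```

The 2*eps_f slack in the acceptance test reflects that the two noisy evaluations compared there can each deviate from the true value by at most eps_f, so a genuine decrease can only be certified up to that amount; this is one common way to handle bounded noise, offered here only as an assumption.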

About NIPS 2019

Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.
