Variance-Reduced Gradient Estimation via Noise-Reuse in Online Evolution Strategies

Dec 10, 2023


About

Unrolled computation graphs are prevalent throughout machine learning but present challenges to automatic differentiation (AD) gradient estimation methods when the loss functions are chaotic, discontinuous, or black-box. For such scenarios, online evolution strategies methods are a more capable alternative, with the additional ability to generate more frequent gradient updates than vanilla evolution strategies (ES). In this work, we propose a general class of unbiased online evolution strategies methods. We analytically and empirically characterize the variance of this class of gradient estimators and identify the one with the least variance, which we term Noise-Reuse Evolution Strategies (NRES). Experimentally, we show that NRES results in faster convergence than existing AD and ES methods in terms of wall-clock speed and number of unroll steps across a variety of applications, including learning dynamical systems, meta-training learned optimizers, and reinforcement learning.
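The abstract does not include pseudocode, so the following is only a rough illustrative sketch of the noise-reuse idea it describes, not the paper's exact estimator. The function names (`unroll`, `nres_gradient`), the toy dynamics and loss, and the parameters `sigma`, `window`, and `n_pairs` are all hypothetical placeholders. The sketch samples one antithetic perturbation per particle pair at the start of an episode, holds it fixed across every truncation window of the unroll, and emits an ES-style gradient estimate after each window, so updates arrive online rather than only at the end of the episode.

```python
import numpy as np

def unroll(theta, state, n_steps):
    """Hypothetical inner dynamics: run n_steps of the unrolled system
    and return (new_state, summed loss over this truncation window)."""
    loss = 0.0
    for _ in range(n_steps):
        state = np.tanh(state + theta)        # toy dynamics stand-in
        loss += float(np.sum(state ** 2))     # toy per-step loss
    return state, loss

def nres_gradient(theta, init_state, total_steps, window, sigma=0.1, n_pairs=8):
    """Sketch of a noise-reuse online ES estimator (assumption, not the
    paper's code): each antithetic particle pair keeps the SAME
    perturbation eps for the whole episode and produces a gradient
    estimate after every truncation window."""
    d = theta.shape[0]
    grad_estimates = []
    for _ in range(n_pairs):
        eps = np.random.randn(d)              # sampled once per episode (noise reuse)
        s_pos, s_neg = init_state.copy(), init_state.copy()
        for _ in range(total_steps // window):
            s_pos, loss_pos = unroll(theta + sigma * eps, s_pos, window)
            s_neg, loss_neg = unroll(theta - sigma * eps, s_neg, window)
            # antithetic ES estimate from this window's losses
            grad_estimates.append((loss_pos - loss_neg) / (2.0 * sigma) * eps)
    return np.mean(grad_estimates, axis=0)
```

Holding the perturbation fixed for the full episode is what the sketch means by "noise reuse"; online ES variants that resample a fresh perturbation at every truncation window need extra bookkeeping to stay unbiased, and the paper's analysis concerns the variance trade-offs among such choices.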

