Dec 13, 2019
We consider problems of smooth nonconvex optimization: unconstrained, bound-constrained, and with general equality constraints. We show that algorithms for these problems that are widely used in practice can be modified slightly in ways that guarantee convergence to approximate first- and second-order optimal points, with complexity guarantees that depend on the desired accuracy. The methods we discuss are constructed from Newton's method, the conjugate gradient method, the log-barrier method, and augmented Lagrangians. (In some cases, special structure of the objective function makes for only a weak dependence on the accuracy parameter.) Our methods require Hessian information only in the form of Hessian-vector products, so they do not require the Hessian to be evaluated and stored explicitly. This talk describes joint work with Clement Royer, Yue Xie, and Michael O'Neill.
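To illustrate the Hessian-vector-product idea mentioned in the abstract, here is a minimal sketch in JAX of a Newton-CG style step that touches the Hessian only through products with vectors. The toy objective `f`, and the helper names `hvp` and `cg_newton_step`, are illustrative assumptions and not the authors' code; the sketch also omits the negative-curvature safeguards and accuracy-dependent stopping rules that give the talk's methods their complexity guarantees.

```python
import jax
import jax.numpy as jnp

def f(x):
    # Toy smooth nonconvex objective (assumed purely for illustration).
    return jnp.sum(x**4 - 2.0 * x**2) + 0.1 * jnp.sum(x)**2

def hvp(x, v):
    # Hessian-vector product via forward-over-reverse autodiff:
    # differentiate the gradient of f along direction v.
    # The Hessian matrix is never formed or stored.
    return jax.jvp(jax.grad(f), (x,), (v,))[1]

def cg_newton_step(x, tol=1e-6, max_iter=100):
    # Plain conjugate gradient applied to the Newton system
    # H(x) d = -grad f(x), using only Hessian-vector products.
    g = jax.grad(f)(x)
    d = jnp.zeros_like(x)
    r = -g
    p = r
    rs_old = jnp.dot(r, r)
    for _ in range(max_iter):
        Hp = hvp(x, p)
        alpha = rs_old / jnp.dot(p, Hp)
        d = d + alpha * p
        r = r - alpha * Hp
        rs_new = jnp.dot(r, r)
        if jnp.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return d

x = jnp.array([1.5, -0.7, 0.3])
print(cg_newton_step(x))
```

Because plain CG can break down when the Hessian has negative curvature at a nonconvex point, practical variants of this scheme detect such directions and exploit them; the sketch above shows only the matrix-free mechanics.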
Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.