Multi-Task Learning with User Preferences: Gradient Descent with Controlled Ascent in Pareto Optimization

Jul 12, 2020

About

Multi-Task Learning (MTL) is a well-established learning paradigm for jointly learning models for multiple correlated tasks. The tasks often conflict, requiring trade-offs between them during optimization. Recent advances in MTL based on multi-objective optimization make it possible to use large-scale deep networks to find one or more Pareto optimal solutions. However, these methods cannot find exact Pareto optimal solutions that satisfy user-specified preferences over the task-specific losses, which is not only a common requirement in applications but also a useful way to explore the infinite set of Pareto optimal solutions. We develop the first gradient-based multi-objective MTL algorithm to address this problem. Our approach combines multiple gradient descent with carefully controlled ascent, which enables it to trace the Pareto front in a principled manner and makes it robust to initialization. Assuming only differentiability of the task-specific loss functions, we provide theoretical guarantees of convergence. We empirically demonstrate the superiority of our algorithm over state-of-the-art methods.
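To make the preference mechanism concrete, here is a minimal numerical sketch on a toy two-task problem. It is not the paper's algorithm: it uses a weighted-Chebyshev subgradient rule as a simple stand-in. The shared idea is that an exact Pareto optimal solution for a preference vector r is one where the r-weighted losses coincide (r_1 l_1 = r_2 l_2), and that driving down the currently worst r-weighted loss can force another task's loss to rise, the tension that the paper's controlled-ascent steps manage in a principled way. All concrete names here (the anchors A and B, the preference r, the step schedule) are illustrative assumptions.

```python
import numpy as np

# Toy two-task problem: each task pulls the 2-D parameter vector toward a
# different anchor, so the Pareto front is the segment between A and B.
A, B = np.array([1.0, 0.0]), np.array([0.0, 1.0])

def losses(x):
    return np.array([np.sum((x - A) ** 2), np.sum((x - B) ** 2)])

def grads(x):
    return np.stack([2.0 * (x - A), 2.0 * (x - B)])  # one gradient per task

# User preference: an exact Pareto optimal solution for r satisfies
# r_1 * l_1 = r_2 * l_2, i.e. the loss vector lies on the ray l ∝ 1/r.
r = np.array([0.2, 0.8])

x = np.array([0.9, -0.5])  # arbitrary initialization
for k in range(5000):
    l, G = losses(x), grads(x)
    j = np.argmax(r * l)  # task with the worst r-weighted loss
    # Subgradient step on max_j r_j * l_j: descending the worst task can
    # make the other task's loss go *up* -- the ascent a preference forces.
    x = x - (0.5 / np.sqrt(k + 1)) * r[j] * G[j]

l = losses(x)
print("losses:", l)
print("r-weighted losses:", r * l)  # approximately equal at convergence
```

With r = (0.2, 0.8) the iterates settle near the point of the front where the r-weighted losses coincide; changing r traces out different exact Pareto optimal solutions, which is how a user preference selects a single point from the infinite Pareto set.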

About ICML 2020

The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.
