Continual Learning in Linear Classification on Separable Data

Jul 24, 2023

We analyze continual learning on a sequence of separable linear classification tasks with binary labels. We show theoretically that learning with weak regularization reduces to solving a sequential max-margin problem, corresponding to a special case of the Projection Onto Convex Sets (POCS) framework. We then develop upper bounds on the forgetting of sequential max-margin in various settings, including cyclic and random orderings of tasks. We discuss several practical implications for popular training practices, like regularization scheduling and weighting.
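The sequential max-margin idea can be sketched numerically. Below is a toy illustration (not the paper's algorithm): each task's separability constraints define a convex intersection of halfspaces {w : y_i x_i·w ≥ 1}, and the learner moves from its current weights into the next task's feasible set via cyclic halfspace projections, a standard POCS scheme. (POCS reaches *a* feasible point; the exact Euclidean projection would need Dykstra's algorithm or a QP solver.) The task data here is made up for illustration, and "forgetting" is measured as the worst margin violation on the earlier task.

```python
import numpy as np

def project_halfspace(w, a):
    """Euclidean projection of w onto the halfspace {v : a.v >= 1}."""
    slack = 1.0 - a @ w
    if slack > 0:  # constraint violated: move minimally to its boundary
        w = w + (slack / (a @ a)) * a
    return w

def pocs_feasible(w, A, sweeps=200):
    """Cyclic projections onto halfspaces {v : a_i.v >= 1} (POCS).
    Converges to a point in their intersection when one exists."""
    for _ in range(sweeps):
        for a in A:
            w = project_halfspace(w, a)
    return w

# Two hypothetical separable tasks; rows are signed examples y_i * x_i,
# so margin-1 separation reads a_i.w >= 1 for every row.
task1 = np.array([[1.0, 0.2], [0.8, -0.1]])
task2 = np.array([[-0.3, 1.0], [0.1, 0.9]])

w = np.zeros(2)
w = pocs_feasible(w, task1)   # fit task 1
w = pocs_feasible(w, task2)   # then task 2, starting from task 1's solution

# Forgetting on task 1: worst violation of its margin constraints afterward.
forgetting = float(np.maximum(0.0, 1.0 - task1 @ w).max())
```

Since each task's feasible set is convex, repeating such projections under a cyclic task ordering is exactly the POCS setting the abstract refers to.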

Presented at ICML 2023.