Jul 24, 2023
We analyze continual learning on a sequence of separable linear classification tasks with binary labels. We show theoretically that learning with weak regularization reduces to solving a sequential max-margin problem, corresponding to a special case of the Projection Onto Convex Sets (POCS) framework. We then develop upper bounds on the forgetting of sequential max-margin in various settings, including cyclic and random orderings of tasks. We discuss several practical implications for popular training practices, such as regularization scheduling and weighting.
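To make the POCS connection concrete, here is a minimal sketch of alternating projections onto convex sets, the framework the abstract reduces sequential max-margin learning to. The two halfspaces, the starting point, and the cyclic ordering are illustrative choices, not details from the talk:

```python
import numpy as np

def project_halfspace(x, a, b):
    """Project x onto the halfspace {z : a @ z >= b}."""
    slack = b - a @ x
    if slack <= 0:
        return x  # already feasible
    return x + slack * a / (a @ a)

# Two example "task" constraints whose intersection is nonempty
a1, b1 = np.array([1.0, 0.0]), 1.0   # constraint: x[0] >= 1
a2, b2 = np.array([0.0, 1.0]), 2.0   # constraint: x[1] >= 2

x = np.zeros(2)
for _ in range(10):  # cyclic ordering of tasks
    x = project_halfspace(x, a1, b1)
    x = project_halfspace(x, a2, b2)

print(x)  # → [1. 2.], a point in the intersection of both sets
```

In the continual-learning analogy, each projection plays the role of training on one task to (max-margin) convergence, and "forgetting" measures how much earlier constraints are violated after later projections.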