A Trichotomy for Transductive Online Learning

NeurIPS 2023 · Dec 10, 2023

About

We present new upper and lower bounds on the number of learner mistakes in the 'transductive' online learning setting of Ben-David, Kushilevitz and Mansour (1997). This setting is similar to standard online learning, except that the adversary fixes a sequence of instances x_1,...,x_n to be labeled at the start of the game, and this sequence is known to the learner. Qualitatively, we prove a _trichotomy_, stating that the asymptotic minimal number of mistakes made by the learner as n grows can take only one of precisely three possible values: n, Θ(log(n)), or Θ(1). Furthermore, this behavior is determined by a combination of the VC dimension and the Littlestone dimension. Quantitatively, we show a variety of bounds relating the number of mistakes to well-known combinatorial dimensions. In particular, we improve the known lower bound on the constant in the Θ(1) case from Ω(√(log(d))) to Ω(log(d)), where d is the Littlestone dimension. Finally, we extend our results to cover multiclass classification and partial concept classes.
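For intuition about the Θ(log(n)) regime, the sketch below (an illustration under our own assumptions, not code from the paper) simulates the transductive game with a Halving learner: since the instances x_1,...,x_n are fixed and known in advance, the learner can project the class onto them, and by Sauer-Shelah a class of VC dimension d induces at most O(n^d) label patterns on the sequence, so majority-vote halving makes O(d log n) mistakes. The `halving_transductive` helper and the threshold class used in the demo are hypothetical choices for illustration.

```python
# Minimal sketch (assumed setup): the transductive online game with a
# Halving learner. Instances are fixed in advance, so the learner tracks
# only the label patterns of the class on x_1,...,x_n that remain
# consistent with the adversary's revealed labels.

def halving_transductive(patterns, true_labels):
    """Run the transductive game over instances indexed 0..n-1.

    `patterns` is the projection of the class onto the fixed sequence;
    the learner predicts the majority vote of the surviving version
    space, so each mistake at least halves it and the total number of
    mistakes is at most log2(len(patterns)).
    """
    version_space = list(patterns)
    mistakes = 0
    for t, y in enumerate(true_labels):
        ones = sum(p[t] for p in version_space)
        prediction = 1 if 2 * ones >= len(version_space) else 0
        if prediction != y:
            mistakes += 1  # the majority was wrong, so >= half the patterns are eliminated
        version_space = [p for p in version_space if p[t] == y]
    return mistakes

# Illustration: thresholds on n points (VC dimension 1) induce n + 1
# patterns, so the worst-case mistake count is at most log2(n + 1) = O(log n).
n = 16
patterns = [tuple(1 if i >= k else 0 for i in range(n)) for k in range(n + 1)]
print(max(halving_transductive(patterns, target) for target in patterns))  # <= 4
```

This is exactly where knowing the sequence helps: in the standard online setting the learner cannot restrict itself to finitely many patterns in advance, whereas here the version space starts at polynomial size, which is what drives the Θ(log(n)) case of the trichotomy.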
