The Bayesian Stability Zoo

Dec 10, 2023



We show that many definitions of stability appearing in the learning theory literature are equivalent to one another. We distinguish two families of stability definitions, which we call _distribution-dependent_ and _distribution-independent Bayesian stability_, and show equivalences between definitions within each family. Our results cover approximate differential privacy, pure differential privacy, replicability, global stability, perfect generalization, TV indistinguishability, mutual information stability, and KL divergence stability. Along the way, we prove boosting results that amplify the stability of an algorithm. This work is a step towards a more systematic taxonomy of stability notions in learning theory, which can promote clarity and improve understanding of the many notions of stability that have appeared in the literature in recent years.
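As a concrete illustration of one of the notions named above: an algorithm is replicable if, when run on two independent samples from the same distribution while sharing its internal random string, it returns the exact same output with high probability. A minimal sketch of the standard randomized-rounding trick behind replicability, assuming a toy mean-estimation task (all names and parameters here are illustrative, not code from the paper):

```python
import random

def replicable_mean(sample, offset, precision=0.1):
    """Estimate a mean, then snap it to the grid {offset + k * precision}.

    Rounding against a *shared* random offset is a common trick for
    replicability: two runs on independent samples that share the offset
    usually land in the same grid cell, so they return identical outputs.
    (Function and parameter names are illustrative.)
    """
    mean = sum(sample) / len(sample)
    return offset + precision * round((mean - offset) / precision)

def agreement_rate(trials=50, n=2000, precision=0.1, seed=0):
    """Fraction of trials in which two runs with shared randomness agree."""
    rng = random.Random(seed)
    agree = 0
    for _ in range(trials):
        offset = rng.uniform(0.0, precision)   # shared internal randomness
        s1 = [rng.random() for _ in range(n)]  # first i.i.d. sample
        s2 = [rng.random() for _ in range(n)]  # independent second sample
        if replicable_mean(s1, offset, precision) == \
           replicable_mean(s2, offset, precision):
            agree += 1
    return agree / trials
```

Without the rounding step, two empirical means of independent samples would essentially never coincide exactly; with the shared offset, `agreement_rate()` is close to 1, and shrinking `precision` trades replicability against accuracy.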


NeurIPS 2023