Jul 24, 2023
When two different parties use the same learning rule on their own data, how can we test whether the distributions of the two outcomes are similar? In this paper, we study the similarity of outcomes of learning rules through the lens of the Total Variation (TV) distance of distributions. We say that a learning rule is TV indistinguishable if the expected TV distance between the posterior distributions of its outputs, executed on two training data sets drawn independently from the same distribution, is small. We first investigate the learnability of hypothesis classes using TV indistinguishable learners. Our main results are information-theoretic equivalences between TV indistinguishability and existing algorithmic stability notions such as replicability and approximate differential privacy. Second, we provide statistical amplification and boosting algorithms for TV indistinguishable learners. En route, we generalize results of Ghazi et al. (2021) and improve the sample complexity of algorithms appearing in Impagliazzo et al. (2022).
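To make the central definition concrete: a learning rule A is ρ-TV indistinguishable if E_{S,S'∼D^n}[d_TV(A(S), A(S'))] ≤ ρ, where S and S' are independent training sets drawn from the same distribution D and A(S) denotes the learner's output distribution over its internal randomness. The sketch below shows how one might estimate this quantity by Monte Carlo for a hypothetical toy learner with a finite output space whose posterior can be computed exactly; the function names (`posterior`, `estimate_expected_tv`) and the Bernoulli data model are illustrative assumptions, not constructions from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def posterior(sample):
    """Output distribution of a toy randomized learner over two
    hypotheses {h0, h1}: it outputs h1 with probability equal to the
    empirical frequency of label 1. Hypothetical example; any learner
    with a computable posterior over a finite output space would do."""
    p1 = sample.mean()
    return np.array([1.0 - p1, p1])

def tv(p, q):
    """Total variation distance between two distributions on a finite
    output space: 0.5 * sum_h |p(h) - q(h)|."""
    return 0.5 * np.abs(p - q).sum()

def estimate_expected_tv(n=50, trials=5000, p_label=0.6):
    """Monte Carlo estimate of E_{S,S' ~ D^n}[ d_TV(A(S), A(S')) ]:
    repeatedly draw two independent training sets from the same
    distribution D and compare the learner's posteriors on them."""
    total = 0.0
    for _ in range(trials):
        s = rng.binomial(1, p_label, size=n)        # training set S ~ D^n
        s_prime = rng.binomial(1, p_label, size=n)  # independent copy S'
        total += tv(posterior(s), posterior(s_prime))
    return total / trials

print(f"estimated expected TV distance: {estimate_expected_tv():.4f}")
```

Note that for a deterministic learner each posterior is a point mass, so this estimate reduces to Pr[A(S) ≠ A(S')], which hints at the connection to replicability that the paper formalizes.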