Jul 24, 2023
Evaluating the performance of machine learning models under distribution shifts is challenging, especially when we only have unlabeled data from the shifted (target) domain, along with labeled data from the original (source) domain. Recent work suggests that the notion of disagreement, the degree to which two models trained with different randomness differ on the same input, is a key to tackling this problem. Experimentally, disagreement and prediction error have been shown to be strongly connected, which has been used to estimate model performance. Experiments have led to the discovery of the disagreement-on-the-line phenomenon, whereby the classification error under the target domain is often a linear function of the classification error under the source domain; and whenever this property holds, disagreement under the source and target domain follow the same linear relation. In this work, we develop a theoretical foundation for analyzing disagreement in high-dimensional random features regression, and study under what conditions the disagreement-on-the-line phenomenon occurs in our setting. Experiments on CIFAR-10-C, Tiny ImageNet-C, and Camelyon17 are consistent with our theory and support the universality of the theoretical findings.
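The quantity at the heart of the abstract, disagreement, is simple to compute in practice: train two models that differ only in their randomness, and measure how often their predictions differ on the same inputs. The sketch below illustrates this for random features regression on synthetic data; all names, the toy data-generating process, and the choice of ReLU features and a ridge-regressed readout are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_rf_classifier(X, y, n_features, seed, lam=1e-2):
    """Random features model: a fixed random first layer (the model's
    'randomness') followed by a ridge-regressed linear readout on +/-1 labels."""
    r = np.random.default_rng(seed)
    W = r.normal(size=(X.shape[1], n_features)) / np.sqrt(X.shape[1])
    Phi = np.maximum(X @ W, 0.0)  # ReLU random features
    A = Phi.T @ Phi + lam * np.eye(n_features)
    w = np.linalg.solve(A, Phi.T @ y)
    return W, w

def predict(model, X):
    W, w = model
    return np.sign(np.maximum(X @ W, 0.0) @ w)

def disagreement(m1, m2, X):
    # Fraction of inputs on which the two models' predictions differ.
    # Note: needs no labels, so it can be computed on the target domain.
    return float(np.mean(predict(m1, X) != predict(m2, X)))

def error(m, X, y):
    return float(np.mean(predict(m, X) != y))

# Toy source/target data: two Gaussian classes; the target domain is a
# covariate-shifted version of the source (larger noise scale).
d, n = 20, 500
mu = np.ones(d) / np.sqrt(d)
y_src = rng.choice([-1.0, 1.0], size=n)
X_src = y_src[:, None] * mu + rng.normal(size=(n, d))
y_tgt = rng.choice([-1.0, 1.0], size=n)
X_tgt = y_tgt[:, None] * mu + 1.5 * rng.normal(size=(n, d))

# Two models trained on the same data but with different random features.
m1 = train_rf_classifier(X_src, y_src, n_features=200, seed=1)
m2 = train_rf_classifier(X_src, y_src, n_features=200, seed=2)

print("source disagreement:", disagreement(m1, m2, X_src))
print("target disagreement:", disagreement(m1, m2, X_tgt))
print("source error:", error(m1, X_src, y_src))
```

Repeating this over many model pairs and shift severities and plotting target versus source disagreement (and target versus source error) is how the linear "on-the-line" relationship is observed empirically.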