Jul 24, 2023
Precision and Recall are two prominent measures of generative performance, proposed to address the challenge of separately measuring the fidelity and diversity of generative models. Given their central role in comparing, fine-tuning, and improving generative models, understanding their limitations is crucially important. To that end, in this work we identify a critical flaw in these measures: the very interpretation of fidelity and diversity assigned to Precision and Recall can fail in high dimensions, leading to highly misleading conclusions. Specifically, we show both empirically and theoretically that as the number of dimensions grows, two model distributions at exactly the same point-wise distance from the real distribution can be assigned vastly different Precision and Recall values, an emergent asymmetry in high dimensions. Based on our theoretical insights, we then provide a simple yet effective modification to these measures to construct a symmetric measure regardless of the number of dimensions. Finally, we present experiments on real-world datasets illustrating that the identified flaw is not merely a pathological case, and that our proposed measures are effective in overcoming it.
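The asymmetry described above can be reproduced in a few lines with the standard k-NN manifold estimate of Precision and Recall (in the style of Kynkäänniemi et al.). The toy experiment below is our own illustrative construction, not the talk's exact setup: the real distribution is N(0, I_d) in d = 64, and the two model distributions are Gaussians with scales 0.5 and 1.5, which sit at the same 2-Wasserstein distance from the real distribution yet receive opposite Precision/Recall scores.

```python
import numpy as np

def pdist2(A, B):
    # Squared Euclidean distances between rows of A and rows of B.
    d2 = (A ** 2).sum(1)[:, None] + (B ** 2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.maximum(d2, 0.0)

def knn_radii(X, k=3):
    # Each point's ball radius = distance to its k-th nearest neighbor in X.
    d2 = pdist2(X, X)
    np.fill_diagonal(d2, np.inf)
    return np.sqrt(np.sort(d2, axis=1)[:, k - 1])

def coverage(A, B, radii_B):
    # Fraction of rows of A landing inside at least one k-NN ball around B.
    d = np.sqrt(pdist2(A, B))
    return float((d <= radii_B[None, :]).any(axis=1).mean())

def precision_recall(real, fake, k=3):
    precision = coverage(fake, real, knn_radii(real, k))  # fidelity proxy
    recall = coverage(real, fake, knn_radii(fake, k))     # diversity proxy
    return precision, recall

rng = np.random.default_rng(0)
d, n = 64, 1000
real = rng.standard_normal((n, d))
# Both models are at W2 distance sqrt(d)*|sigma - 1| = 0.5*sqrt(d) from the real data.
narrow = 0.5 * rng.standard_normal((n, d))  # concentrated inside the real support
wide = 1.5 * rng.standard_normal((n, d))    # spread around the real support

p_n, r_n = precision_recall(real, narrow)
p_w, r_w = precision_recall(real, wide)
print(f"narrow: precision={p_n:.2f} recall={r_n:.2f}")
print(f"wide:   precision={p_w:.2f} recall={r_w:.2f}")
```

In high dimensions the concentrated model scores near-perfect Precision and near-zero Recall, while the spread-out model scores the reverse, despite their equal distance from the real distribution; this is the emergent asymmetry the work targets.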