A Dimensionality Reduction Method for Finding Least Favorable Priors with a Focus on Bregman Divergence

Mar 28, 2022


A common way of characterizing minimax estimators in point estimation is to move the problem into the Bayesian estimation domain and find a least-favorable distribution. Under mild conditions, the Bayesian estimator induced by a least-favorable prior is known to be minimax. However, finding least-favorable distributions can be challenging because it requires optimizing over the space of probability distributions, which is infinite-dimensional. This paper develops a dimensionality reduction method that moves the optimization to a finite-dimensional setting with an explicit bound on the dimension. The benefit of this dimensionality reduction is that it permits the use of popular algorithms such as gradient descent to find least-favorable distributions. Throughout the paper, in order to make progress on the problem, we restrict ourselves to the Bayesian risks induced by a relatively large class of loss functions, namely Bregman divergences.
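To illustrate the idea of searching for a least-favorable prior in a finite-dimensional parameterization, here is a minimal sketch that is not taken from the paper: it assumes a Binomial(n, θ) model, squared-error loss (the simplest Bregman divergence), and a prior restricted to a fixed grid of support points, with gradient ascent on the Bayes risk (the least-favorable prior is the one that maximizes it). All function names and parameter choices below are illustrative.

```python
import math
import numpy as np

def bayes_risk(w, thetas, L):
    """Bayes risk under squared-error loss for the discrete prior softmax(w)
    on the grid `thetas`, given the likelihood matrix L[i, x] = P(X=x | thetas[i])."""
    p = np.exp(w - w.max())
    p /= p.sum()                                # prior weights on the grid
    marg = p @ L                                # marginal P(X = x)
    post_mean = ((p * thetas) @ L) / marg       # Bayes estimator: posterior mean
    # risk of each theta, averaged over the prior
    per_theta = ((post_mean[None, :] - thetas[:, None]) ** 2 * L).sum(axis=1)
    return p @ per_theta, p

def least_favorable_prior(n=5, m=21, steps=300, lr=5.0, eps=1e-6):
    """Gradient *ascent* on the Bayes risk over priors supported on a fixed grid,
    using a finite-difference gradient for simplicity."""
    thetas = np.linspace(0.01, 0.99, m)
    xs = np.arange(n + 1)
    # Binomial likelihood table, computed once
    L = np.array([[math.comb(n, x) * t ** x * (1 - t) ** (n - x) for x in xs]
                  for t in thetas])
    w = np.zeros(m)                             # start from the uniform grid prior
    for _ in range(steps):
        base, _ = bayes_risk(w, thetas, L)
        grad = np.zeros(m)
        for i in range(m):
            w2 = w.copy()
            w2[i] += eps
            grad[i] = (bayes_risk(w2, thetas, L)[0] - base) / eps
        w += lr * grad
    return bayes_risk(w, thetas, L)
```

As a sanity check, a classical result says that for Binomial(n, θ) under squared error the least-favorable prior is Beta(√n/2, √n/2) with minimax risk 1/(4(1+√n)²), so the Bayes risk found by the grid search should approach (but never exceed) that value.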

About AISTATS 2022

AISTATS is an interdisciplinary gathering of researchers at the intersection of computer science, artificial intelligence, machine learning, statistics, and related areas. Since its inception in 1985, the primary goal of AISTATS has been to broaden research in these fields by promoting the exchange of ideas among them. We encourage the submission of all papers that are in keeping with this objective.
