Jul 12, 2020
In the context of an empirical Bayes model for meta-learning where a subset of model parameters is treated as latent variables, we propose a novel scheme for amortized variational inference. The approach builds on the conditional variational autoencoder framework, which makes it possible to learn a conditional prior distribution over model parameters given limited training data. In our model, the prior and posterior distributions over the model parameters share the same amortized inference network. While posterior inference leverages both the test and the training data, including the labels, prior inference is based on the training data only. We show that in earlier approaches based on Monte Carlo approximation the prior collapses to a Dirac delta function; in contrast, our variational approach prevents this collapse and preserves uncertainty over the model parameters. We evaluate our approach on standard benchmark datasets, including miniImageNet, and obtain results demonstrating its advantage over previous work.
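The core mechanism described above, one shared amortized inference network producing both the prior (from the training set only) and the posterior (from the training and test sets, labels included), can be sketched in a few lines. This is a minimal illustration under assumed simplifications, not the paper's architecture: the "network" is a single linear map over a mean-pooled set representation, dimensions and names (`infer`, `W`, `D_LATENT`) are invented for the example, and both distributions are diagonal Gaussians over the latent model parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: input features and latent model parameters.
D_IN, D_LATENT = 4, 3
# Shared weights of the amortized inference network (used for prior AND posterior).
W = rng.normal(scale=0.1, size=(D_IN + 1, 2 * D_LATENT))

def infer(xs, ys):
    """Amortized inference: map a labeled set to (mu, log_var) of a
    diagonal Gaussian over latent model parameters."""
    feats = np.concatenate([xs, ys[:, None]], axis=1)  # append labels to inputs
    pooled = feats.mean(axis=0)                        # permutation-invariant pooling
    out = pooled @ W
    return out[:D_LATENT], out[D_LATENT:]

def kl_diag_gauss(mu_q, lv_q, mu_p, lv_p):
    """KL( N(mu_q, exp(lv_q)) || N(mu_p, exp(lv_p)) ), diagonal Gaussians."""
    return 0.5 * np.sum(
        lv_p - lv_q + (np.exp(lv_q) + (mu_q - mu_p) ** 2) / np.exp(lv_p) - 1.0
    )

# A toy few-shot task: a small training (support) and test (query) set.
x_tr, y_tr = rng.normal(size=(5, D_IN)), rng.integers(0, 2, size=5).astype(float)
x_te, y_te = rng.normal(size=(3, D_IN)), rng.integers(0, 2, size=3).astype(float)

# Prior: same network, conditioned on the training data only.
mu_p, lv_p = infer(x_tr, y_tr)
# Posterior: same network, conditioned on training + test data with labels.
mu_q, lv_q = infer(np.vstack([x_tr, x_te]), np.concatenate([y_tr, y_te]))

# The KL term regularizes the posterior toward the data-dependent prior;
# because the prior keeps a nonzero variance, it cannot collapse to a delta.
kl = kl_diag_gauss(mu_q, lv_q, mu_p, lv_p)
```

In training, this KL term would enter the variational objective alongside the expected likelihood on the test labels; the key point illustrated is that the same weights `W` parameterize both distributions, differing only in which data they are conditioned on.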
The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.