Jul 2, 2022
Non-Gaussian Component Analysis (NGCA) is the following distribution learning problem: given i.i.d. samples from a distribution on ℝ^d that is non-Gaussian in a hidden direction v and an independent standard Gaussian in the orthogonal directions, the goal is to approximate the hidden direction v. Prior work <cit.> provided formal evidence for the existence of an information-computation tradeoff for NGCA under appropriate moment-matching conditions on the univariate non-Gaussian distribution A. The latter result does not apply when the distribution A is discrete. A natural question is whether information-computation tradeoffs persist in this setting. In this paper, we answer this question in the negative by obtaining a sample- and computationally efficient algorithm for NGCA in the regime where A is discrete or nearly discrete, in a well-defined technical sense. The key tool leveraged in our algorithm is the LLL method <cit.> for lattice basis reduction.
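The key tool mentioned above, LLL lattice basis reduction, can be illustrated with a minimal textbook sketch. The following is a generic δ = 3/4 implementation over exact rationals for illustration only; it is not the paper's NGCA algorithm, and the helper names (`gram_schmidt`, `lll`) are our own:

```python
from fractions import Fraction

def dot(u, v):
    # Exact inner product over the rationals.
    return sum(Fraction(a) * Fraction(b) for a, b in zip(u, v))

def gram_schmidt(B):
    # Orthogonalize the basis B; return the orthogonal vectors B*
    # and the Gram-Schmidt coefficients mu[i][j] = <b_i, b*_j> / <b*_j, b*_j>.
    n = len(B)
    Bs, mu = [], [[Fraction(0)] * n for _ in range(n)]
    for i in range(n):
        v = [Fraction(x) for x in B[i]]
        for j in range(i):
            mu[i][j] = dot(B[i], Bs[j]) / dot(Bs[j], Bs[j])
            v = [vi - mu[i][j] * bj for vi, bj in zip(v, Bs[j])]
        Bs.append(v)
    return Bs, mu

def lll(B, delta=Fraction(3, 4)):
    # Textbook LLL: alternate size reduction with the Lovász condition test,
    # swapping adjacent basis vectors when the condition fails.
    B = [list(map(Fraction, row)) for row in B]
    n = len(B)
    Bs, mu = gram_schmidt(B)
    k = 1
    while k < n:
        for j in range(k - 1, -1, -1):          # size-reduce b_k against b_j
            q = round(mu[k][j])
            if q != 0:
                B[k] = [bk - q * bj for bk, bj in zip(B[k], B[j])]
                Bs, mu = gram_schmidt(B)        # recompute (simple, not fast)
        if dot(Bs[k], Bs[k]) >= (delta - mu[k][k - 1] ** 2) * dot(Bs[k - 1], Bs[k - 1]):
            k += 1                               # Lovász condition holds
        else:
            B[k - 1], B[k] = B[k], B[k - 1]      # swap and step back
            Bs, mu = gram_schmidt(B)
            k = max(k - 1, 1)
    return B

# Example: reduce a small 3-dimensional lattice basis.
reduced = lll([[1, 1, 1], [-1, 0, 2], [3, 5, 6]])
```

This recomputes the full Gram-Schmidt decomposition after every update for clarity; production implementations (e.g. in fpLLL or SageMath) update the coefficients incrementally and use floating-point arithmetic with exact fallbacks.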
The conference has been held annually since 1988 and has become the leading conference on learning theory by maintaining a highly selective submission process. It is committed to publishing high-quality articles on all theoretical aspects of machine learning and related topics.