            Cryptographic Hardness of Learning Single Periodic Neurons

Dec 6, 2021

Speakers

            Min Jae Song

Speaker · 0 followers

            Ilias Zadik

Speaker · 0 followers

            Joan Bruna

Speaker · 4 followers

About

            We show a simple reduction which demonstrates the cryptographic hardness of learning a single periodic neuron over isotropic Gaussian distributions in the presence of noise. More precisely, our reduction shows that any polynomial-time algorithm (not necessarily gradient-based) for learning such functions under small noise implies a polynomial-time quantum algorithm for solving worst-case lattice problems which form the foundation of lattice-based cryptography. Our core hard family of functions,…
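            To make the learning setup concrete, here is a minimal sketch of the data such a learner would observe: isotropic Gaussian inputs labeled by a single periodic neuron plus small noise. The cosine activation, the frequency gamma, and the noise level beta are illustrative assumptions (the abstract above is truncated and does not pin them down), not a verbatim specification from the paper.

```python
# Illustrative sketch (not taken from the paper): sampling data from a single
# periodic neuron over isotropic Gaussian inputs with small additive noise.
# The cosine activation, frequency gamma, and noise level beta are assumptions.
import numpy as np

rng = np.random.default_rng(0)

d = 128          # input dimension
n = 10_000       # number of samples
gamma = 3.0      # frequency of the periodic activation (assumed)
beta = 0.01      # standard deviation of the label noise (assumed)

# Hidden direction w on the unit sphere.
w = rng.normal(size=d)
w /= np.linalg.norm(w)

# Isotropic Gaussian inputs x ~ N(0, I_d).
X = rng.normal(size=(n, d))

# Noisy labels from the periodic neuron: y = cos(2*pi*gamma*<w, x>) + noise.
y = np.cos(2 * np.pi * gamma * (X @ w)) + beta * rng.normal(size=n)

# A learner only sees (X, y); the reduction described above says that any
# polynomial-time algorithm achieving small prediction error on such data
# would yield a quantum algorithm for worst-case lattice problems.
print(X.shape, y.shape)
```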

Organizer

            NeurIPS 2021

Account · 1.9k followers

About NeurIPS 2021

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.

Like the format? Trust SlidesLive to capture your next event!

Professional recording and live streaming, delivered worldwide.

Recommended videos

Presentations with a similar topic, category, or speaker

Kronecker Decomposition for GPT Compression
05:12

Ali Edalati, …

NeurIPS 2021 3 years ago

Good Classification Measures and How to Find Them
13:26

Martijn Gösgens, …

NeurIPS 2021 3 years ago

Regularization in ResNet with Stochastic Depth
13:45

Soufiane Hayou, …

NeurIPS 2021 3 years ago

Quantifying Model Predictive Uncertainty with Perturbation Theory
05:02

Rishabh Singh, …

NeurIPS 2021 3 years ago

Learning latent causal graphs via mixture oracles
12:33

Bohdan Kivva, …

NeurIPS 2021 3 years ago

Can Less be More? When Increasing-to-Balancing Label Noise Rates Considered Beneficial
14:49

Yang Liu, …

NeurIPS 2021 3 years ago

Interested in talks like this? Follow NeurIPS 2021