
            Vector-output ReLU Neural Network Problems are Copositive Programs: Convex Analysis of Two Layer Networks and Polynomial-time Algorithms

            May 3, 2021

            Speakers

Arda Sahiner
Tolga Ergen
John Pauly

            About

            We describe the convex semi-infinite dual of the two-layer vector-output ReLU neural network training problem. This semi-infinite dual admits a finite dimensional representation, but its support is over a convex set which is difficult to characterize. In particular, we demonstrate that the non-convex neural network training problem is equivalent to a finite-dimensional convex copositive program. Our work is the first to identify this strong connection between the global optima of neural networks…
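For readers who want the problem class pinned down, here is a minimal sketch of the standard two-layer vector-output ReLU training objective the abstract appears to refer to. The notation is our assumption, not quoted from the talk: data X ∈ R^{n×d}, targets Y ∈ R^{n×c}, network width m, and weight-decay parameter β > 0.

\[
\min_{W_1 \in \mathbb{R}^{d \times m},\; W_2 \in \mathbb{R}^{m \times c}}
\ \frac{1}{2} \bigl\| (X W_1)_+ \, W_2 - Y \bigr\|_F^2
\;+\; \frac{\beta}{2} \sum_{j=1}^{m} \Bigl( \|w_{1j}\|_2^2 + \|w_{2j}\|_2^2 \Bigr)
\]

Here (u)_+ = max(u, 0) applies the ReLU entrywise, w_{1j} is the j-th column of W_1, and w_{2j} is the j-th row of W_2. For reference, a copositive program optimizes a linear objective over the cone of copositive matrices, {S = S^T : z^T S z ≥ 0 for all z ≥ 0}, whose dual is the cone of completely positive matrices; the abstract's claim is that the global optima of the non-convex objective above coincide with those of a finite-dimensional convex program over such a cone, which is why the support set is hard to characterize.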

            Organizer

ICLR 2021

            About ICLR 2021

            The International Conference on Learning Representations (ICLR) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence called representation learning, but generally referred to as deep learning. ICLR is globally renowned for presenting and publishing cutting-edge research on all aspects of deep learning used in the fields of artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, text understanding, gaming, and robotics.


            Recommended Videos

Presentations on a similar topic, category, or speaker

Lossy Image Compression with Normalizing Flows
05:10 · L. Helminger, … · ICLR 2021

Greedy-GQ with Variance Reduction: Finite-Time Analysis and Improved Complexity
02:59 · Shaocong Ma, … · ICLR 2021

When do curricula work?
14:37 · Xiaoxia Shirley Wu, … · ICLR 2021

CoCon: A Self-Supervised Approach for Controlled Text Generation
06:00 · Alvin Chan, … · ICLR 2021

Neural Approximate Sufficient Statistics for Implicit Models
10:09 · Yanzhi Chen, … · ICLR 2021

Boosting black-box adversarial attack via exploiting loss smoothness
03:28 · Hoang A. Tran, … · ICLR 2021
