Real-Time Optimisation for Online Learning in Auctions

July 12, 2020

Speakers

Lorenzo Croissant
Speaker · 0 followers

Marc Abeille
Speaker · 0 followers

Clément Calauzènes
Speaker · 0 followers

About the presentation

In display advertising, a small group of sellers and bidders face each other in up to 10^12 auctions a day. In this context, revenue maximisation via monopoly price learning is a high-value problem for sellers. By nature, these auctions are online and produce a very high-frequency stream of data. This results in a computational strain that requires algorithms to be real-time. Unfortunately, existing methods, inherited from the batch setting, suffer O(sqrt(t)) time/memory complexity at each update…
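The per-update cost the abstract highlights can be made concrete with a toy sketch. The Python snippet below is only an illustration of the constant-time, O(1)-per-auction regime the talk argues for, not the algorithm presented there: it takes one stochastic-gradient step per incoming bid on a smoothed revenue proxy p·sigmoid((bid − p)/width) for a posted price p. The sigmoid surrogate, the log-normal bid stream, the step-size schedule and all names are assumptions made for the example; a batch-style baseline would instead re-optimise the price over the accumulated bid history at every round, which is where a per-update cost that grows with t comes from.

```python
import numpy as np

def smoothed_revenue_grad(price, bid, width=0.05):
    """Gradient in p of the smoothed revenue proxy r(p) = p * sigmoid((bid - p) / width).

    The exact per-auction revenue p * 1[bid >= p] is not differentiable in p, so this
    illustration replaces the indicator with a sigmoid of bandwidth `width`
    (an assumption made for the example, not the estimator from the talk).
    """
    s = 1.0 / (1.0 + np.exp(-(bid - price) / width))
    # d/dp [p * s(p)] = s + p * ds/dp, with ds/dp = -s * (1 - s) / width
    return s - price * s * (1.0 - s) / width

def online_price_update(price, bid, step):
    """One constant-time update: ascend the smoothed revenue on the latest bid only."""
    return max(0.0, price + step * smoothed_revenue_grad(price, bid))

# Toy high-frequency bid stream (log-normal bids are an arbitrary choice for the demo).
rng = np.random.default_rng(0)
price = 0.5
for t, bid in enumerate(rng.lognormal(mean=0.0, sigma=0.5, size=100_000), start=1):
    price = online_price_update(price, bid, step=0.1 / np.sqrt(t))

print(f"learned posted price ~= {price:.3f}")
```

Each auction touches only the latest bid, so memory stays constant and the loop keeps pace with an arbitrarily long stream; this is the property the abstract contrasts with batch-inherited methods.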

Organizer

ICML 2020
Account · 2.7k followers

Category

Economics and Finance

Category · 755 presentations

About the organizer (ICML 2020)

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.


Recommended videos

Presentations on a similar topic, in the same category, or by the same speaker

Generalization and Representational Limits of Graph Neural Networks
12:42
Vikas K Garg, …
ICML 2020 · 5 years ago

Reverse-engineering deep ReLU networks
14:09
David Rolnick, …
ICML 2020 · 5 years ago

Coresets for Data-efficient Training of Machine Learning Models
16:15
Baharan Mirzasoleiman, …
ICML 2020 · 5 years ago

Non-Autoregressive Neural Text-to-Speech
15:12
Kainan Peng, …
ICML 2020 · 5 years ago

Semi-Supervised StyleGAN for Disentanglement Learning
16:02
Weili Nie, …
ICML 2020 · 5 years ago

Which Tasks Should Be Learned Together in Multi-task Learning?
15:06
Trevor Standley, …
ICML 2020 · 5 years ago
