Effective Neural Topic Modeling with Embedding Clustering Regularization
            Jul 24, 2023

Speakers

Xiaobao Wu
Xinshuai Dong
Thong Nguyen

            About

            Topic models have been prevalent for decades with various applications. However, existing topic models commonly suffer from the notorious topic collapsing: discovered topics semantically collapse towards each other, leading to highly repetitive topics, insufficient topic discovery, and damaged model interpretability. In this paper, we propose a new neural topic model, Embedding Clustering Regularization Topic Model (ECRTM). Besides the existing reconstruction error, we propose a novel Embedding…
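The abstract above describes adding a clustering-style regularization term on embeddings, alongside the usual reconstruction error, so that topic embeddings stop collapsing onto each other. As a rough illustration only (not the paper's actual objective), the NumPy sketch below shows one generic penalty of this kind: each word embedding is softly assigned to the topic embeddings, and the expected word-to-topic distance is penalized. Collapsed topics that sit far from every word cluster then incur a larger loss than topics spread across the clusters. The function name and the softmax-based soft assignment are assumptions for illustration; ECRTM's actual formulation differs.

```python
import numpy as np

def clustering_regularizer(topic_emb, word_emb, temperature=1.0):
    """Generic clustering-style penalty (illustrative, not ECRTM's exact loss).

    topic_emb: (K, D) array of topic embeddings
    word_emb:  (V, D) array of word embeddings
    """
    # Pairwise squared Euclidean distances, shape (K, V).
    d = ((topic_emb[:, None, :] - word_emb[None, :, :]) ** 2).sum(-1)
    # Soft assignment of each word to the topics (softmax over topics).
    p = np.exp(-d / temperature)
    p = p / p.sum(axis=0, keepdims=True)
    # Expected word-to-topic distance under the soft assignment.
    return float((p * d).sum(axis=0).mean())

# Two well-separated word clusters in 2-D.
words = np.array([[0.0, 0.0], [1.0, 1.0], [10.0, 10.0], [11.0, 11.0]])
spread = np.array([[0.5, 0.5], [10.5, 10.5]])   # one topic near each cluster
collapsed = np.array([[5.0, 5.0], [5.0, 5.0]])  # both topics collapsed together

# Spread topics sit near the word clusters, so their penalty is much smaller.
assert clustering_regularizer(spread, words) < clustering_regularizer(collapsed, words)
```

In a neural topic model this kind of term would be added to the reconstruction loss with a weight, pushing topic embeddings toward distinct word-embedding clusters rather than a single degenerate point.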

Organizer

ICML 2023


Recommended Videos

Presentations on a similar topic, category, or speaker:

• D2Match: Leveraging Deep Learning and Degeneracy for Subgraph Matching · Xuanzhou Liu, … · ICML 2023 · 05:04
• Nearly Minimax Optimal Regret for Learning Linear Mixture Stochastic Shortest Path · Qiwei Di, … · ICML 2023 · 05:08
• Moderately Distributional Exploration for Domain Generalization · Rui Dai, … · ICML 2023 · 04:38
• Sliced-Wasserstein on Symmetric Positive Definite Matrices for M/EEG Signals · Clément Bonet, … · ICML 2023 · 05:02
• Why Random Pruning Is All We Need to Start Sparse · Advait Gadhikar, … · ICML 2023 · 05:23
• Equivariance with Learned Canonicalization Functions · Sékou-Oumar Kaba, … · ICML 2023 · 05:15
