title: Speed-optimized, Compact Student Models that Distill Knowledge from a Larger Teacher Model: the UEDIN-CUNI Submission to the WMT 2020 News Translation Task
Nov 16, 2020
Speakers
Organizer
Categories
About EMNLP 2020
The 2020 Conference on Empirical Methods in Natural Language Processing