Parameter-Efficient Low-Resource Dialogue State Tracking by Prompt Tuning

December 2, 2022

About the presentation

Dialogue state tracking (DST) is an important step in dialogue management to keep track of users' beliefs. Existing works fine-tune all language model (LM) parameters to tackle the DST task, which requires significant data and computing resources for training and hosting. The cost multiplies in real-world deployment, where dozens of fine-tuned LMs are used for different domains and tasks. To develop domain-specific models that better utilize slot-related information with less training data and fewer parameters, we propose to use soft prompt tokens to learn task properties, incorporate segment information, and reiterate the task before predicting values. Without tuning LM parameters, our method drastically reduces the number of parameters needed to less than 0.5% of prior works while achieving better low-resource DST performance.
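As a concrete illustration of the frozen-LM prompt-tuning setup described in the abstract, below is a minimal sketch in PyTorch with a HuggingFace backbone. The model choice (gpt2), prompt length, and class name are illustrative assumptions, not the authors' exact configuration; the segment-embedding and task-reiteration components are omitted for brevity. Only the soft prompt embeddings receive gradients.

    # Minimal soft-prompt-tuning sketch; "gpt2" and prompt_len=20 are
    # illustrative assumptions, not the paper's exact setup.
    import torch
    import torch.nn as nn
    from transformers import AutoModelForCausalLM, AutoTokenizer

    class SoftPromptLM(nn.Module):
        def __init__(self, model_name="gpt2", prompt_len=20):
            super().__init__()
            self.lm = AutoModelForCausalLM.from_pretrained(model_name)
            # Freeze every LM parameter; only the soft prompt is trained.
            for p in self.lm.parameters():
                p.requires_grad = False
            emb_dim = self.lm.get_input_embeddings().embedding_dim
            # Learnable soft prompt token embeddings.
            self.soft_prompt = nn.Parameter(torch.randn(prompt_len, emb_dim) * 0.02)

        def forward(self, input_ids, labels=None):
            tok_emb = self.lm.get_input_embeddings()(input_ids)   # (B, T, D)
            batch = input_ids.size(0)
            prompt = self.soft_prompt.unsqueeze(0).expand(batch, -1, -1)
            # Prepend the learned prompt to the token embeddings.
            inputs_embeds = torch.cat([prompt, tok_emb], dim=1)   # (B, P+T, D)
            if labels is not None:
                # -100 masks the prompt positions out of the LM loss.
                pad = torch.full((batch, self.soft_prompt.size(0)), -100,
                                 dtype=labels.dtype, device=labels.device)
                labels = torch.cat([pad, labels], dim=1)
            return self.lm(inputs_embeds=inputs_embeds, labels=labels)

A toy usage example (the dialogue string and slot name are made up for illustration):

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = SoftPromptLM()
    batch = tokenizer("user: i need a cheap hotel. state: hotel-pricerange =",
                      return_tensors="pt")
    out = model(batch.input_ids, labels=batch.input_ids)
    out.loss.backward()  # gradients flow only into model.soft_prompt

Because the backbone stays frozen, only the prompt embeddings (prompt_len x emb_dim values) need to be stored per domain or task, which is where the sub-0.5% parameter count comes from.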
