Dec 2, 2022
With the ever-growing size of pre-trained models (PMs), fine-tuning has become more expensive and resource-hungry. As a remedy, low-rank adapters (LoRA) keep the main pre-trained weights of the model frozen and introduce only a small number of learnable truncated-SVD modules (so-called LoRA blocks) into the model. While LoRA blocks are parameter-efficient, they suffer from two major problems: first, the size of these blocks is fixed and cannot be modified after training (for example, if the rank of the LoRA blocks needs to change, they must be retrained from scratch); second, optimizing their rank requires an exhaustive search. In this work, we introduce a dynamic low-rank adaptation (DyLoRA) solution to address these two problems together. Our DyLoRA method trains LoRA blocks for a range of ranks instead of a single rank by sorting the representations learned at different ranks during training. We evaluate our solution on different tasks in the GLUE benchmark using the RoBERTa model. Our results show that we can train DyLoRA at least 7x faster than LoRA without significantly compromising performance. Moreover, our models perform consistently well over a much wider range of ranks than LoRA.
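The core idea in the abstract can be sketched as follows: a frozen base weight is augmented with low-rank factors, and at each training step a rank b is sampled so that only the first b rows/columns of the factors are used. This is a minimal, hypothetical NumPy illustration of that mechanism, not the authors' implementation; the class name, shapes, and initialization scheme are assumptions for the sketch.

```python
import numpy as np

class DyLoRALinear:
    """Sketch of a DyLoRA-style linear layer: a frozen pre-trained weight W
    plus low-rank factors (A, B) that are truncated to a sampled rank b at
    each training step. Hypothetical illustration, not the paper's code."""

    def __init__(self, in_features, out_features, max_rank=8, scale=1.0, seed=0):
        rng = np.random.default_rng(seed)
        # Frozen pre-trained weight (never updated during adaptation).
        self.W = rng.standard_normal((out_features, in_features)) * 0.02
        # Trainable down- and up-projections; B starts at zero so the
        # adapter initially leaves the base model's output unchanged.
        self.A = rng.standard_normal((max_rank, in_features)) * 0.01
        self.B = np.zeros((out_features, max_rank))
        self.max_rank = max_rank
        self.scale = scale
        self.rng = rng

    def forward(self, x, rank=None):
        # During training, sample b uniformly from [1, max_rank]; at
        # inference, any fixed rank <= max_rank can be used directly,
        # without retraining for that rank.
        if rank is None:
            rank = int(self.rng.integers(1, self.max_rank + 1))
        # Truncate the factors to the first `rank` components.
        delta = x @ self.A[:rank].T @ self.B[:, :rank].T
        return x @ self.W.T + self.scale * delta
```

Because only leading slices of A and B participate at each step, the representation learned at rank b is nested inside the one learned at any larger rank, which is what lets a single trained adapter serve a whole range of ranks.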
Presentations on similar topic, category or speaker
Felix Biggs, …
Chirag Raman, …
Yizhou Zhang, …
Conglong Li, …
Xiang Gu, …
Hao Lu, …