This post provides an overview of LoRA, a parameter-efficient technique for fine-tuning large language models, and of its main variants. It explains the basic concept of LoRA and introduces variants such as LoRA+, VeRA, LoRA-FA, LoRA-drop, AdaLoRA, DoRA, and Delta-LoRA, briefly describing each variant's distinctive features and improvements. The post concludes by noting that research on LoRA and related methods continues to evolve, with further breakthroughs expected.
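To make the basic concept concrete, here is a minimal sketch of the core LoRA idea (an illustrative NumPy example, not code from the post): instead of updating a full weight matrix W, one trains two small low-rank matrices A and B so that the effective weight becomes W + BA. All dimensions and initializations below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r = 512, 512, 8  # rank r is much smaller than the layer dimensions

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable low-rank factor
B = np.zeros((d_out, r))                    # trainable; zero init, so training starts from W

def lora_forward(x):
    # Frozen path W @ x plus the low-rank update B @ (A @ x).
    return W @ x + B @ (A @ x)

x = rng.standard_normal(d_in)
# With B initialized to zero, the LoRA path contributes nothing at the start.
assert np.allclose(lora_forward(x), W @ x)

# Parameter savings: a full weight update vs. the low-rank update.
full_params = d_out * d_in            # 262144
lora_params = r * d_in + d_out * r    # 8192
print(full_params, lora_params)
```

With these example dimensions, the low-rank update trains roughly 3% of the parameters a full update would, which is the efficiency gain the variants in this post build on.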

18 min read · From towardsdatascience.com
Table of contents
An Overview of the LoRA Family
- LoRA
- LoRA+
- VeRA
- LoRA-FA
- LoRA-drop
- AdaLoRA
- DoRA
- Delta-LoRA
- Summary
- References and Further Reading
