RepLoRA: Reparameterizing Low-rank Adaptation via the Perspective of Mixture of Experts

Authors: Tuan Truong, Chau Nguyen, Huy Nguyen, Minh Le, Trung Le, Nhat Ho

Low-rank Adaptation (LoRA) has emerged as a powerful and efficient method for fine-tuning large-scale foundation models. Despite its popularity, the theoretical understanding of LoRA has remained underexplored. In this paper, we present a theoretical analysis of LoRA by examining its connection to Mixture of Experts (MoE) models. Under this framework, we show that a simple technique, reparameterizing the LoRA matrices, can notably accelerate the low-rank matrix estimation process. In particular, we prove that reparameterization can reduce the amount of data needed to achieve a desired estimation error from an exponential to a polynomial scale. Motivated by this insight, we propose **Rep**arameterized **Lo**w-**R**ank **A**daptation (RepLoRA), which incorporates a lightweight MLP to reparameterize the LoRA matrices. Extensive experiments across multiple domains demonstrate that RepLoRA consistently outperforms vanilla LoRA. With limited data, RepLoRA surpasses LoRA by a substantial margin of up to **40.0%** and matches LoRA's performance using only **30.0%** of the training data, highlighting the theoretical and empirical robustness of our parameter-efficient fine-tuning (PEFT) method.
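The abstract only sketches the architecture, so the following is a minimal, hypothetical PyTorch illustration of the general idea of producing the LoRA factors through a lightweight MLP. The class name `RepLoRALinear`, the seed matrices `seed_a`/`seed_b`, and the shape-preserving two-layer MLPs are assumptions made for illustration; they are not taken from the paper and may differ from the authors' actual parameterization.

```python
import torch
import torch.nn as nn


class RepLoRALinear(nn.Module):
    """Hypothetical sketch of a linear layer whose low-rank (LoRA) factors
    are reparameterized by a lightweight MLP; names and shapes are assumptions."""

    def __init__(self, in_features, out_features, rank=8, hidden=16, alpha=16.0):
        super().__init__()
        # Frozen stand-in for a pretrained weight matrix of the base model.
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.02,
                                   requires_grad=False)
        self.scaling = alpha / rank

        # Trainable seed matrices; the MLPs below map them to the LoRA factors.
        self.seed_a = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.seed_b = nn.Parameter(torch.randn(out_features, rank) * 0.01)

        # Lightweight, shape-preserving MLPs used for the reparameterization.
        self.mlp_a = nn.Sequential(nn.Linear(in_features, hidden), nn.ReLU(),
                                   nn.Linear(hidden, in_features))
        self.mlp_b = nn.Sequential(nn.Linear(rank, hidden), nn.ReLU(),
                                   nn.Linear(hidden, rank))

    def forward(self, x):
        a = self.mlp_a(self.seed_a)        # (rank, in_features)
        b = self.mlp_b(self.seed_b)        # (out_features, rank)
        delta = b @ a                      # low-rank update to the frozen weight
        return x @ (self.weight + self.scaling * delta).T
```

As a usage example, `RepLoRALinear(768, 768)(torch.randn(4, 768))` returns a `(4, 768)` tensor; only the seed matrices and the MLPs receive gradients, while the frozen base weight stays fixed, mirroring the standard LoRA training setup.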

Subject: ICML.2025 - Poster