

SALT: Singular Value Adaptation with Low-Rank Transformation

March 20, 2025
Authors: Abdelrahman Elsayed, Sarim Hashmi, Mohammed Elseiagy, Hu Wang, Mohammad Yaqub, Ibrahim Almakky
cs.AI

Abstract
The complex nature of medical image segmentation calls for models that are specifically designed to capture detailed, domain-specific features. Large foundation models offer considerable flexibility, yet the cost of fine-tuning them remains a significant barrier. Parameter-Efficient Fine-Tuning (PEFT) methods, such as Low-Rank Adaptation (LoRA), efficiently update model weights with low-rank matrices but may suffer from underfitting when the chosen rank is insufficient to capture domain-specific nuances. Conversely, full-rank Singular Value Decomposition (SVD) based methods provide comprehensive updates by modifying all singular values, yet they often lack flexibility and exhibit variable performance across datasets. We propose SALT (Singular Value Adaptation with Low-Rank Transformation), a method that selectively adapts the most influential singular values using trainable scale and shift parameters while complementing this with a low-rank update for the remaining subspace. This hybrid approach harnesses the advantages of both LoRA and SVD, enabling effective adaptation without increasing model size or depth. Evaluated on 5 challenging medical datasets ranging from as few as 20 to 1,000 samples, SALT outperforms state-of-the-art PEFT methods (LoRA and SVD) by 2% to 5% in Dice score with only 3.9% trainable parameters, demonstrating robust adaptation even in low-resource settings. The code for SALT is available at: https://github.com/BioMedIA-MBZUAI/SALT
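The hybrid update described in the abstract can be sketched in a few lines of NumPy. This is an illustrative reconstruction based only on the abstract, not the authors' implementation: the function name `salt_update`, the exact remapping `s_i -> scale_i * s_i + shift_i` of the top-r singular values, and the placement of the LoRA-style term `B @ A` are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def salt_update(W, scale, shift, B, A, r):
    """Sketch of a SALT-style adapted weight (assumed form, not official code).

    W            : (m, n) frozen pretrained weight matrix.
    scale, shift : length-r trainable vectors; the top-r singular values
                   are remapped as s_i -> scale_i * s_i + shift_i.
    B, A         : (m, k) and (k, n) trainable low-rank factors providing
                   the LoRA-style update for the remaining subspace.
    """
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    S_adapted = S.copy()
    S_adapted[:r] = scale * S[:r] + shift       # adapt the dominant singular values
    # Reconstruct with adapted spectrum, then add the low-rank residual update.
    return (U * S_adapted) @ Vt + B @ A

# Toy usage: an 8x6 weight, r=2 adapted singular values, rank-3 residual update.
W = rng.standard_normal((8, 6))
r, k = 2, 3
scale = np.ones(r)            # identity initialization: scale=1, shift=0, B=0,
shift = np.zeros(r)           # so the adapted weight starts equal to W
B = np.zeros((8, k))
A = rng.standard_normal((k, 6))
W_new = salt_update(W, scale, shift, B, A, r)
assert np.allclose(W_new, W)  # identity init leaves the pretrained weight unchanged
```

Only `scale`, `shift`, `B`, and `A` would be trained (roughly `2r + k(m + n)` parameters per layer), which is consistent with the abstract's claim of a small trainable-parameter budget relative to the full weight matrix.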
