Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning

November 18, 2023
Authors: Clifton Poth, Hannah Sterz, Indraneil Paul, Sukannya Purkayastha, Leon Engländer, Timo Imhof, Ivan Vulić, Sebastian Ruder, Iryna Gurevych, Jonas Pfeiffer
cs.AI

Abstract

We introduce Adapters, an open-source library that unifies parameter-efficient and modular transfer learning in large language models. By integrating 10 diverse adapter methods into a unified interface, Adapters offers ease of use and flexible configuration. Our library allows researchers and practitioners to leverage adapter modularity through composition blocks, enabling the design of complex adapter setups. We demonstrate the library's efficacy by evaluating its performance against full fine-tuning on various NLP tasks. Adapters provides a powerful tool for addressing the challenges of conventional fine-tuning paradigms and promoting more efficient and modular transfer learning. The library is available via https://adapterhub.ml/adapters.
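To make the unified interface and composition blocks mentioned above concrete, here is a minimal sketch based on the library's public documentation at https://adapterhub.ml/adapters. The adapter names ("task_adapter", "lang_adapter") are illustrative, and exact config identifiers may vary across library versions.

```python
# Minimal sketch of the Adapters library's unified interface (assumptions:
# adapter names are illustrative; config keys follow the documented naming).
import adapters
from adapters.composition import Stack
from transformers import AutoModelForSequenceClassification

# Load a standard Hugging Face model and attach adapter support to it.
model = AutoModelForSequenceClassification.from_pretrained("roberta-base")
adapters.init(model)

# Add a sequential bottleneck adapter and freeze the base model,
# so that only the adapter's parameters are updated during training.
model.add_adapter("task_adapter", config="seq_bn")
model.train_adapter("task_adapter")

# Composition blocks: stack a second (here hypothetical) language adapter
# under the task adapter to combine the two modules at inference time.
model.add_adapter("lang_adapter", config="seq_bn")
model.active_adapters = Stack("lang_adapter", "task_adapter")
```

The same pattern extends to the other composition blocks the library provides, allowing complex multi-adapter setups to be expressed declaratively rather than by modifying model code.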