ZipLoRA: Any Subject in Any Style by Effectively Merging LoRAs
November 22, 2023
Authors: Viraj Shah, Nataniel Ruiz, Forrester Cole, Erika Lu, Svetlana Lazebnik, Yuanzhen Li, Varun Jampani
cs.AI
Abstract
Methods for finetuning generative models for concept-driven personalization
generally achieve strong results for subject-driven or style-driven generation.
Recently, low-rank adaptations (LoRA) have been proposed as a
parameter-efficient way of achieving concept-driven personalization. While
recent work explores the combination of separate LoRAs to achieve joint
generation of learned styles and subjects, existing techniques do not reliably
address the problem; they often compromise either subject fidelity or style
fidelity. We propose ZipLoRA, a method to cheaply and effectively merge
independently trained style and subject LoRAs in order to achieve generation of
any user-provided subject in any user-provided style. Experiments on a wide
range of subject and style combinations show that ZipLoRA can generate
compelling results with meaningful improvements over baselines in subject and
style fidelity while preserving the ability to recontextualize. Project page:
https://ziplora.github.io
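The abstract describes ZipLoRA as cheaply and effectively merging independently trained style and subject LoRAs. The sketch below is only a rough illustration of what merging two LoRA weight deltas with learnable per-column coefficients can look like; the weight-space objective, the interference penalty and its weight, and the names lora_delta and merge_columns are illustrative assumptions, not the paper's actual algorithm or training objective.

```python
# Illustrative sketch (assumptions, not the authors' implementation):
# merge two LoRA weight deltas with learnable per-column coefficients.
import torch
import torch.nn.functional as F


def lora_delta(A: torch.Tensor, B: torch.Tensor) -> torch.Tensor:
    """Dense weight update of one LoRA: delta_W = B @ A, shape (out_dim, in_dim)."""
    return B @ A


def merge_columns(delta_style: torch.Tensor,
                  delta_subject: torch.Tensor,
                  steps: int = 200,
                  lr: float = 1e-2,
                  interference_weight: float = 0.01):
    """Learn per-column coefficients m1, m2 so the merged delta stays close to
    each individual delta while the scaled columns of the two LoRAs overlap
    as little as possible (cosine-similarity penalty). All hyperparameters
    here are placeholders."""
    out_dim, in_dim = delta_style.shape
    m1 = torch.ones(in_dim, requires_grad=True)  # one coefficient per column
    m2 = torch.ones(in_dim, requires_grad=True)
    opt = torch.optim.Adam([m1, m2], lr=lr)

    for _ in range(steps):
        # Broadcasting scales each column of the deltas by its coefficient.
        merged = delta_style * m1 + delta_subject * m2
        # Keep the merged update close to both source LoRAs (assumed proxy
        # for preserving style and subject fidelity).
        recon = ((merged - delta_style) ** 2).mean() + \
                ((merged - delta_subject) ** 2).mean()
        # Penalize column-wise interference between the two scaled deltas.
        cos = F.cosine_similarity(delta_style * m1, delta_subject * m2, dim=0)
        loss = recon + interference_weight * cos.abs().mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

    return m1.detach(), m2.detach()


# Example usage with random low-rank factors standing in for trained LoRAs.
if __name__ == "__main__":
    out_dim, in_dim, rank = 64, 64, 4
    d_style = lora_delta(torch.randn(rank, in_dim), torch.randn(out_dim, rank))
    d_subject = lora_delta(torch.randn(rank, in_dim), torch.randn(out_dim, rank))
    m1, m2 = merge_columns(d_style, d_subject)
    merged_delta = d_style * m1 + d_subject * m2  # added to the base weights at inference
```

In practice the coefficients would be learned per merged layer and the merged delta added to the frozen base-model weights; the paper's project page (https://ziplora.github.io) documents the actual procedure.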