ZipLoRA: Any Subject in Any Style by Effectively Merging LoRAs
November 22, 2023
Authors: Viraj Shah, Nataniel Ruiz, Forrester Cole, Erika Lu, Svetlana Lazebnik, Yuanzhen Li, Varun Jampani
cs.AI
Abstract
Methods for finetuning generative models for concept-driven personalization
generally achieve strong results for subject-driven or style-driven generation.
Recently, low-rank adaptations (LoRA) have been proposed as a
parameter-efficient way of achieving concept-driven personalization. While
recent work explores the combination of separate LoRAs to achieve joint
generation of learned styles and subjects, existing techniques do not reliably
address the problem; they often compromise either subject fidelity or style
fidelity. We propose ZipLoRA, a method to cheaply and effectively merge
independently trained style and subject LoRAs in order to achieve generation of
any user-provided subject in any user-provided style. Experiments on a wide
range of subject and style combinations show that ZipLoRA can generate
compelling results with meaningful improvements over baselines in subject and
style fidelity while preserving the ability to recontextualize. Project page:
https://ziplora.github.io
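To make the merging problem concrete, the sketch below illustrates the naive direct-merge baseline that the abstract alludes to: each LoRA stores a low-rank update ΔW = B·A to a frozen base weight, and two adapters can be combined by a weighted sum of their updates. This is a minimal toy illustration with made-up dimensions and fixed scalar weights; it is not ZipLoRA's actual method, which instead learns merge coefficients so the two updates do not interfere.

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, rank = 8, 6, 2  # toy dimensions, not taken from the paper

def lora_delta(rng, d_out, d_in, rank):
    """Build a random rank-`rank` LoRA update delta_W = B @ A."""
    B = rng.normal(size=(d_out, rank))  # down-projected output directions
    A = rng.normal(size=(rank, d_in))   # low-rank input projection
    return B @ A

# Two independently trained adapters, e.g. one for style, one for subject.
delta_style = lora_delta(rng, d_out, d_in, rank)
delta_subject = lora_delta(rng, d_out, d_in, rank)

# Naive direct merge: a fixed weighted sum of the two updates. This is
# the kind of baseline that can compromise subject or style fidelity.
w_style, w_subject = 0.5, 0.5
delta_merged = w_style * delta_style + w_subject * delta_subject

# Apply the merged update to a (random stand-in) frozen base weight.
W_base = rng.normal(size=(d_out, d_in))
W_merged = W_base + delta_merged
print(W_merged.shape)  # (8, 6)
```

The sum of two rank-2 updates is at most rank 4, so the merged adapter stays cheap to store; the fidelity problem the paper targets comes from the two updates overwriting each other in shared directions, not from rank growth.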