StrandDesigner: Towards Practical Strand Generation with Sketch Guidance

August 3, 2025
作者: Na Zhang, Moran Li, Chengming Xu, Han Feng, Xiaobin Hu, Jiangning Zhang, Weijian Cao, Chengjie Wang, Yanwei Fu
cs.AI

Abstract

Realistic hair strand generation is crucial for applications like computer graphics and virtual reality. While diffusion models can generate hairstyles from text or images, these inputs lack precision and user-friendliness. Instead, we propose the first sketch-based strand generation model, which offers finer control while remaining user-friendly. Our framework tackles key challenges, such as modeling complex strand interactions and diverse sketch patterns, through two main innovations: a learnable strand upsampling strategy that encodes 3D strands into multi-scale latent spaces, and a multi-scale adaptive conditioning mechanism using a transformer with diffusion heads to ensure consistency across granularity levels. Experiments on several benchmark datasets show our method outperforms existing approaches in realism and precision. Qualitative results further confirm its effectiveness. Code will be released at [GitHub](https://github.com/fighting-Zhang/StrandDesigner).
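The abstract's coarse-to-fine idea, encoding 3D strands at multiple resolutions, can be illustrated with a deliberately simplified stand-in. The sketch below (not the paper's method; all names are hypothetical) replaces the *learnable* strand upsampler with fixed linear interpolation, just to show how a pyramid of strand resolutions is built from a coarse guide strand:

```python
import numpy as np

def upsample_strand(points, factor=2):
    """Interpolate a polyline of 3D points to a finer resolution.

    A non-learnable stand-in for the paper's learnable upsampling
    strategy, which would replace this fixed interpolation with
    trained weights operating in a latent space.
    """
    n = len(points)
    t_coarse = np.linspace(0.0, 1.0, n)
    t_fine = np.linspace(0.0, 1.0, (n - 1) * factor + 1)
    # Interpolate each of the x, y, z coordinates independently.
    return np.stack(
        [np.interp(t_fine, t_coarse, points[:, d]) for d in range(3)],
        axis=1,
    )

# A toy strand: 4 guide points sampled along a curve.
coarse = np.array([[0.0, 0.0, 0.0],
                   [0.1, 0.3, 0.0],
                   [0.2, 0.6, 0.1],
                   [0.3, 0.9, 0.2]])

# Build a coarse-to-fine pyramid of strand resolutions; in the paper
# each level would correspond to one scale of the multi-scale latent
# space that the conditioning mechanism keeps consistent.
pyramid = [coarse]
for _ in range(2):
    pyramid.append(upsample_strand(pyramid[-1]))

print([p.shape for p in pyramid])  # [(4, 3), (7, 3), (13, 3)]
```

In the actual framework, a transformer with diffusion heads would condition generation at each of these granularity levels on the input sketch; here the pyramid only demonstrates the resolution hierarchy itself.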