

StrandDesigner: Towards Practical Strand Generation with Sketch Guidance

August 3, 2025
Authors: Na Zhang, Moran Li, Chengming Xu, Han Feng, Xiaobin Hu, Jiangning Zhang, Weijian Cao, Chengjie Wang, Yanwei Fu
cs.AI

Abstract

Realistic hair strand generation is crucial for applications like computer graphics and virtual reality. While diffusion models can generate hairstyles from text or images, these inputs lack precision and user-friendliness. Instead, we propose the first sketch-based strand generation model, which offers finer control while remaining user-friendly. Our framework tackles key challenges, such as modeling complex strand interactions and diverse sketch patterns, through two main innovations: a learnable strand upsampling strategy that encodes 3D strands into multi-scale latent spaces, and a multi-scale adaptive conditioning mechanism using a transformer with diffusion heads to ensure consistency across granularity levels. Experiments on several benchmark datasets show our method outperforms existing approaches in realism and precision. Qualitative results further confirm its effectiveness. Code will be released at [GitHub](https://github.com/fighting-Zhang/StrandDesigner).
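The paper's learnable upsampler refines coarse strand geometry into finer scales; its code is not yet released. As a toy stand-in for that coarse-to-fine step, the sketch below upsamples a strand (a 3D polyline) by fixed linear interpolation between consecutive points. The function name `upsample_strand` and the `factor` parameter are hypothetical and not from the paper, which uses a learned, multi-scale latent encoding rather than interpolation.

```python
import numpy as np

def upsample_strand(points: np.ndarray, factor: int = 2) -> np.ndarray:
    """Toy coarse-to-fine refinement: linearly interpolate extra points
    along a strand polyline of shape (n, 3). The real method learns this
    upsampling in a multi-scale latent space; this is only illustrative."""
    n = len(points)
    # Parameterize the original and refined polylines on [0, 1].
    t_old = np.linspace(0.0, 1.0, n)
    t_new = np.linspace(0.0, 1.0, (n - 1) * factor + 1)
    # Interpolate each coordinate (x, y, z) independently.
    return np.stack(
        [np.interp(t_new, t_old, points[:, d]) for d in range(3)], axis=1
    )

# A coarse 3-point strand refined to 5 points.
coarse = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 2.0, 1.0]])
fine = upsample_strand(coarse, factor=2)
print(fine.shape)  # (5, 3)
```

The refined strand keeps the original endpoints and doubles the segment count per pass; stacking several such passes mirrors, very loosely, the multi-scale hierarchy the paper's learned upsampler operates over.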
PDF (August 8, 2025)