SANeRF-HQ: Segment Anything for NeRF in High Quality

December 3, 2023
Authors: Yichen Liu, Benran Hu, Chi-Keung Tang, Yu-Wing Tai
cs.AI

Abstract

Recently, the Segment Anything Model (SAM) has showcased remarkable capabilities of zero-shot segmentation, while NeRF (Neural Radiance Fields) has gained popularity as a method for various 3D problems beyond novel view synthesis. Though there exist initial attempts to incorporate these two methods into 3D segmentation, they face the challenge of accurately and consistently segmenting objects in complex scenarios. In this paper, we introduce Segment Anything for NeRF in High Quality (SANeRF-HQ) to achieve high-quality 3D segmentation of any object in a given scene. SANeRF-HQ utilizes SAM for open-world object segmentation guided by user-supplied prompts, while leveraging NeRF to aggregate information from different viewpoints. To overcome the aforementioned challenges, we employ the density field and RGB similarity to enhance the accuracy of segmentation boundaries during aggregation. Emphasizing segmentation accuracy, we evaluate our method quantitatively on multiple NeRF datasets where high-quality ground truths are available or manually annotated. SANeRF-HQ shows a significant quality improvement over previous state-of-the-art methods in NeRF object segmentation, provides higher flexibility for object localization, and enables more consistent object segmentation across multiple views. Additional information can be found at https://lyclyc52.github.io/SANeRF-HQ/.
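To make the described pipeline concrete, the sketch below illustrates one plausible way 2D SAM masks from several views could be aggregated into a 3D objectness field using standard NeRF volume-rendering weights, with a crude RGB-similarity term standing in for the paper's boundary refinement. This is a minimal, hypothetical illustration under assumed shapes and function names, not the authors' implementation; the toy data and the `refine_with_rgb` heuristic are invented for demonstration only.

```python
# Illustrative sketch (not the SANeRF-HQ implementation): aggregate per-view
# SAM mask labels onto 3D sample points via NeRF rendering weights, then
# refine with an RGB-similarity term. All names and data are hypothetical.
import numpy as np

def render_weights(sigma, deltas):
    """Standard NeRF volume-rendering weights w_i = T_i * (1 - exp(-sigma_i * delta_i))."""
    alpha = 1.0 - np.exp(-sigma * deltas)                                   # per-sample opacity
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1] + 1e-10]))   # transmittance T_i
    return trans * alpha

def aggregate_mask_to_field(sam_labels, weights, sample_ids, n_points):
    """Accumulate per-ray SAM labels (0/1) onto the 3D samples each ray visits,
    weighted by their rendering contribution, then normalize into an objectness score."""
    votes = np.zeros(n_points)
    norm = np.zeros(n_points)
    for label, w, ids in zip(sam_labels, weights, sample_ids):
        votes[ids] += w * label
        norm[ids] += w
    return votes / np.maximum(norm, 1e-8)

def refine_with_rgb(objectness, colors, seed_color, tau=0.1):
    """Keep high objectness only where a point's color resembles the prompted
    object's color (a stand-in for the paper's RGB-similarity refinement)."""
    similarity = np.exp(-np.linalg.norm(colors - seed_color, axis=-1) / tau)
    return objectness * similarity

# Toy example: 3 rays (pixels across views), 4 samples per ray, 12 points total.
rng = np.random.default_rng(0)
n_points = 12
sigma = rng.uniform(0.0, 5.0, size=(3, 4))                  # densities along each ray
deltas = np.full((3, 4), 0.05)                              # sample spacing
weights = [render_weights(s, d) for s, d in zip(sigma, deltas)]
sample_ids = [np.arange(i * 4, (i + 1) * 4) for i in range(3)]
sam_labels = [1.0, 1.0, 0.0]                                # SAM mask value at each ray's pixel
colors = rng.uniform(0.0, 1.0, size=(n_points, 3))          # colors of the 3D samples

obj = aggregate_mask_to_field(sam_labels, weights, sample_ids, n_points)
obj = refine_with_rgb(obj, colors, seed_color=colors[0])
print("objectness per 3D sample:", np.round(obj, 3))
```

The key idea the sketch captures is that multi-view aggregation alone can blur object boundaries, so per-point density (through the rendering weights) and color similarity provide additional cues to sharpen where the segmented object ends.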