

EquiformerV2: Improved Equivariant Transformer for Scaling to Higher-Degree Representations

June 21, 2023
Authors: Yi-Lun Liao, Brandon Wood, Abhishek Das, Tess Smidt
cs.AI

Abstract

Equivariant Transformers such as Equiformer have demonstrated the efficacy of applying Transformers to the domain of 3D atomistic systems. However, they are still limited to small degrees of equivariant representations due to their computational complexity. In this paper, we investigate whether these architectures can scale well to higher degrees. Starting from Equiformer, we first replace SO(3) convolutions with eSCN convolutions to efficiently incorporate higher-degree tensors. Then, to better leverage the power of higher degrees, we propose three architectural improvements -- attention re-normalization, separable S^2 activation and separable layer normalization. Putting this all together, we propose EquiformerV2, which outperforms previous state-of-the-art methods on the large-scale OC20 dataset by up to 12% on forces and 4% on energies, offers better speed-accuracy trade-offs, and gives a 2× reduction in the DFT calculations needed for computing adsorption energies.
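
The abstract names the three architectural changes without detailing them. As a rough illustration of the "separable layer normalization" idea -- normalizing the invariant degree-0 channels separately from the equivariant higher-degree channels -- here is a minimal PyTorch sketch. The feature layout, class name, and the choice of an RMS-style rescaling for the higher degrees are assumptions for illustration only, not the authors' implementation.

```python
# Minimal sketch of separable layer normalization for spherical-harmonic features.
# Degree-0 (invariant) channels get a standard LayerNorm; higher-degree channels are
# rescaled by their RMS norm without mean subtraction so equivariance is preserved.
# Layout, class name, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn


class SeparableLayerNormSketch(nn.Module):
    def __init__(self, num_channels: int, max_degree: int, eps: float = 1e-6):
        super().__init__()
        self.max_degree = max_degree
        self.eps = eps
        # Standard LayerNorm for the scalar (degree-0) part.
        self.scalar_norm = nn.LayerNorm(num_channels, eps=eps)
        # One learnable scale per channel for the higher-degree part.
        self.vector_scale = nn.Parameter(torch.ones(num_channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, (max_degree + 1)^2, num_channels); components of degree l
        # occupy rows l^2 .. (l + 1)^2 - 1 along the second dimension.
        scalars = self.scalar_norm(x[:, :1, :])   # degree 0
        vectors = x[:, 1:, :]                     # degrees 1 .. max_degree
        # RMS over all higher-degree components and channels, no mean subtraction.
        rms = vectors.pow(2).mean(dim=(1, 2), keepdim=True).clamp_min(self.eps).sqrt()
        vectors = vectors / rms * self.vector_scale
        return torch.cat([scalars, vectors], dim=1)


# Usage: features for 8 nodes, degrees up to 2 (9 components), 16 channels.
x = torch.randn(8, 9, 16)
out = SeparableLayerNormSketch(num_channels=16, max_degree=2)(x)
print(out.shape)  # torch.Size([8, 9, 16])
```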