Einstein Fields: A Neural Perspective To Computational General Relativity
July 15, 2025
Authors: Sandeep Suresh Cranganore, Andrei Bodnar, Arturs Berzins, Johannes Brandstetter
cs.AI
Abstract
We introduce Einstein Fields, a neural representation that is designed to
compress computationally intensive four-dimensional numerical relativity
simulations into compact implicit neural network weights. By modeling the
metric, which is the core tensor field of general relativity, Einstein
Fields enable the derivation of physical quantities via automatic
differentiation. However, unlike conventional neural fields (e.g., signed
distance, occupancy, or radiance fields), Einstein Fields are Neural Tensor
Fields, with the key difference that when the spacetime geometry of general
relativity is encoded as a neural field representation, the dynamics emerge
naturally as a byproduct. Einstein Fields show remarkable potential, including
continuum modeling of 4D spacetime, mesh agnosticity, storage efficiency,
derivative accuracy, and ease of use. We demonstrate these capabilities across
several canonical test beds of general relativity and release an open-source
JAX-based library, paving the way for more scalable and expressive approaches
to numerical relativity. Code is available at
https://github.com/AndreiB137/EinFields.
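The abstract's central mechanism, representing the metric tensor as a neural field and deriving geometric quantities by automatic differentiation, can be sketched in JAX. Everything below (the MLP architecture, initialization, and function names) is a hypothetical illustration, not the EinFields API: an MLP maps a 4D spacetime coordinate to the 10 independent components of a symmetric metric, and the Christoffel symbols follow from its autodiff Jacobian.

```python
# Hypothetical sketch (not the EinFields API): a neural metric field
# g_{mu nu}(x) whose derivatives come from JAX automatic differentiation.
import jax
import jax.numpy as jnp


def init_mlp(key, sizes=(4, 64, 64, 10)):
    # Random small MLP: 4D coordinate -> 10 independent metric components.
    params = []
    for i in range(len(sizes) - 1):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (sizes[i], sizes[i + 1])) * 0.1
        params.append((w, jnp.zeros(sizes[i + 1])))
    return params


def metric(params, x):
    # Forward pass, then assemble a symmetric 4x4 metric from 10 outputs.
    h = x
    for w, b in params[:-1]:
        h = jnp.tanh(h @ w + b)
    w, b = params[-1]
    comps = h @ w + b
    g = jnp.zeros((4, 4)).at[jnp.triu_indices(4)].set(comps)
    return g + g.T - jnp.diag(jnp.diag(g))


def christoffel(params, x):
    # Gamma^l_{mn} = 1/2 g^{ls} (d_m g_{sn} + d_n g_{sm} - d_s g_{mn}),
    # with all metric derivatives obtained via jax.jacfwd.
    g_inv = jnp.linalg.inv(metric(params, x))
    dg = jax.jacfwd(lambda y: metric(params, y))(x)  # dg[a, b, c] = d_c g_{ab}
    terms = dg.transpose(0, 2, 1) + dg - dg.transpose(2, 0, 1)
    return 0.5 * jnp.einsum('ls,smn->lmn', g_inv, terms)
```

Because the field is smooth in the coordinates, no finite-difference stencil or mesh is needed: second-order quantities such as the Riemann tensor would follow by differentiating `christoffel` again in the same way.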