Einstein Fields: A Neural Perspective To Computational General Relativity
July 15, 2025
Authors: Sandeep Suresh Cranganore, Andrei Bodnar, Arturs Berzins, Johannes Brandstetter
cs.AI
Abstract
We introduce Einstein Fields, a neural representation designed to compress computationally intensive four-dimensional numerical relativity simulations into compact implicit neural network weights. By modeling the metric, the core tensor field of general relativity, Einstein Fields enable the derivation of physical quantities via automatic differentiation. Unlike conventional neural fields (e.g., signed distance, occupancy, or radiance fields), Einstein Fields are neural tensor fields, with the key difference that, when the spacetime geometry of general relativity is encoded into a neural field representation, dynamics emerge naturally as a byproduct. Einstein Fields show remarkable potential, including continuous modeling of 4D spacetime, mesh agnosticism, storage efficiency, derivative accuracy, and ease of use. We demonstrate these capabilities across several canonical test beds of general relativity and release an open-source JAX-based library, paving the way for more scalable and expressive approaches to numerical relativity. Code is available at https://github.com/AndreiB137/EinFields.
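The abstract describes encoding the metric tensor as a neural field and deriving physical quantities by automatic differentiation. Below is a minimal, hypothetical JAX sketch of that idea: a plain MLP predicts a symmetric perturbation of the Minkowski metric, and Christoffel symbols follow from metric derivatives via `jax.jacfwd`. The architecture, function names, and Minkowski-perturbation ansatz are illustrative assumptions, not the released EinFields API.

```python
import jax
import jax.numpy as jnp

# Minkowski background; the MLP models a perturbation h_{ab} around it
# (an illustrative choice, not necessarily how EinFields parameterizes g).
ETA = jnp.diag(jnp.array([-1.0, 1.0, 1.0, 1.0]))

def init_mlp(key, sizes=(4, 64, 64, 10)):
    """Random MLP weights mapping (t, x, y, z) -> 10 metric components."""
    params = []
    for fan_in, fan_out in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        params.append((0.1 * jax.random.normal(sub, (fan_in, fan_out)),
                       jnp.zeros(fan_out)))
    return params

def mlp(params, x):
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return x @ w + b

def metric(params, coords):
    """Symmetric metric g_{ab} = eta_{ab} + h_{ab}, h predicted by the MLP."""
    vals = mlp(params, coords)                                  # 10 components
    h = jnp.zeros((4, 4)).at[jnp.triu_indices(4)].set(vals)     # upper triangle
    h = h + h.T - jnp.diag(jnp.diag(h))                         # symmetrize
    return ETA + h

def christoffel(params, coords):
    """Gamma^a_{bc} = 1/2 g^{ad} (d_b g_{dc} + d_c g_{db} - d_d g_{bc})."""
    g = metric(params, coords)
    g_inv = jnp.linalg.inv(g)
    # Autodiff replaces finite differencing: dg[i, j, k] = d_k g_{ij}.
    dg = jax.jacfwd(lambda c: metric(params, c))(coords)
    term = (jnp.transpose(dg, (0, 2, 1))    # entry [d,b,c] = d_b g_{dc}
            + dg                            # entry [d,b,c] = d_c g_{db}
            - jnp.transpose(dg, (2, 0, 1))) # entry [d,b,c] = d_d g_{bc}
    return 0.5 * jnp.einsum("ad,dbc->abc", g_inv, term)

key = jax.random.PRNGKey(0)
params = init_mlp(key)
coords = jnp.array([0.0, 1.0, 2.0, 3.0])
g = metric(params, coords)
Gamma = christoffel(params, coords)
```

In a training loop, `params` would be fit so that `metric` matches simulation data, after which curvature quantities come from further `jacfwd` calls rather than stored grids; the symmetry of `Gamma` in its lower indices falls out of the construction.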