Optimize Any Topology: A Foundation Model for Shape- and Resolution-Free Structural Topology Optimization
October 26, 2025
Authors: Amin Heyrani Nobari, Lyle Regenwetter, Cyril Picard, Ligong Han, Faez Ahmed
cs.AI
Abstract
Structural topology optimization (TO) is central to engineering design but
remains computationally intensive due to complex physics and hard constraints.
Existing deep-learning methods are limited to fixed square grids, a few
hand-coded boundary conditions, and post-hoc optimization, preventing general
deployment. We introduce Optimize Any Topology (OAT), a foundation-model
framework that directly predicts minimum-compliance layouts for arbitrary
aspect ratios, resolutions, volume fractions, loads, and fixtures. OAT combines
a resolution- and shape-agnostic autoencoder with an implicit neural-field
decoder and a conditional latent-diffusion model trained on OpenTO, a new
corpus of 2.2 million optimized structures covering 2 million unique
boundary-condition configurations. On four public benchmarks and two
challenging unseen tests, OAT lowers mean compliance up to 90% relative to the
best prior models and delivers sub-second inference on a single GPU across
resolutions from 64 x 64 to 256 x 256 and aspect ratios as high as 10:1. These
results establish OAT as a general, fast, and resolution-free framework for
physics-aware topology optimization and provide a large-scale dataset to spur
further research in generative modeling for inverse design. Code & data can be
found at https://github.com/ahnobari/OptimizeAnyTopology.
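For context, "minimum compliance" refers to the standard density-based TO objective: minimize c(rho) = u^T K(rho) u subject to the equilibrium equation K(rho) u = f and an upper bound on the volume fraction of material rho. The sketch below is not the released OAT code; it only illustrates, with assumed module names, latent sizes, and a toy DDPM-style sampler, how a conditional latent-diffusion model can feed an implicit neural-field decoder that is queried on an arbitrary coordinate grid, which is what makes such a pipeline resolution- and aspect-ratio-free.

# Minimal sketch (not the released OAT code): conditional latent diffusion
# followed by an implicit neural-field decoder queried at an arbitrary
# resolution and aspect ratio. All module names and shapes are assumptions.
import torch
import torch.nn as nn


class ImplicitFieldDecoder(nn.Module):
    """Maps (x, y) coordinates plus a latent code to a density in [0, 1]."""

    def __init__(self, latent_dim=128, hidden=256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 + latent_dim, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, coords, z):
        # coords: (N, 2) in [0, 1]^2; z: (latent_dim,) shared latent code.
        z = z.expand(coords.shape[0], -1)
        return torch.sigmoid(self.mlp(torch.cat([coords, z], dim=-1)))


class CondDenoiser(nn.Module):
    """Placeholder denoiser eps(z_t, t, c) conditioned on encoded BCs, loads, volume fraction."""

    def __init__(self, latent_dim=128, cond_dim=16, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + cond_dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, latent_dim),
        )

    def forward(self, z_t, t, cond):
        return self.net(torch.cat([z_t, cond, t], dim=-1))


@torch.no_grad()
def sample_layout(denoiser, decoder, cond, height, width, steps=50, latent_dim=128):
    """Draw a latent via a toy DDPM-style ancestral loop, then decode on any grid."""
    z = torch.randn(1, latent_dim)
    betas = torch.linspace(1e-4, 0.02, steps)
    alphas = 1.0 - betas
    alpha_bars = torch.cumprod(alphas, dim=0)
    for i in reversed(range(steps)):
        t = torch.full((1, 1), i / steps)
        eps = denoiser(z, t, cond)
        # Standard DDPM mean update, followed by noise injection except at the last step.
        z = (z - betas[i] / torch.sqrt(1 - alpha_bars[i]) * eps) / torch.sqrt(alphas[i])
        if i > 0:
            z = z + torch.sqrt(betas[i]) * torch.randn_like(z)
    # Resolution-free decoding: query the field on whatever grid is requested.
    ys, xs = torch.meshgrid(
        torch.linspace(0, 1, height), torch.linspace(0, 1, width), indexing="ij"
    )
    coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)
    density = decoder(coords, z.squeeze(0)).reshape(height, width)
    return density  # grayscale material layout in [0, 1]


cond = torch.zeros(1, 16)  # stand-in for an encoding of loads, fixtures, volume fraction
layout = sample_layout(CondDenoiser(), ImplicitFieldDecoder(), cond, height=64, width=640)

Because the decoder consumes continuous coordinates rather than a fixed pixel grid, the same latent can be rendered at 64 x 64, 256 x 256, or the elongated 64 x 640 (10:1) grid used above, mirroring the resolution and aspect-ratio ranges reported in the abstract.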