Solving complex partial differential equations (PDEs) is crucial for modeling physical systems, but traditional numerical methods falter in high-dimensional or multi-scale scenarios. While neural operators, particularly Transformer-based ones, have shown promise as data-driven alternatives, they typically treat all spatial data uniformly, applying the same costly global attention to smooth, large-scale dynamics and to high-frequency fluctuations alike. To overcome this inefficiency, researchers have introduced DynFormer, a dynamics-informed neural operator designed to account explicitly for scale separation in physical fields.
Rethinking Transformers for Physical Dynamics
DynFormer reimagines the Transformer architecture through the lens of complex dynamics. Instead of a one-size-fits-all attention mechanism, it assigns specialized modules to different physical scales. A key innovation is a Spectral Embedding that isolates the low-frequency modes, allowing a Kronecker-structured attention mechanism to capture large-scale global interactions at substantially reduced complexity. For the small-scale, fast-varying turbulent cascades that are intrinsically coupled to the macroscopic state, DynFormer instead employs a Local-Global-Mixing transformation: nonlinear multiplicative frequency mixing implicitly reconstructs these fine-grained details without the computational burden of global attention. Integrating these components into a hybrid evolutionary architecture also yields robust long-term temporal stability, a critical factor for accurate simulations.
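To make the scale-separation idea concrete, here is a minimal PyTorch-style sketch of the three ingredients as described above. All class names, tensor shapes, and hyperparameters (SpectralEmbedding, KroneckerAttention, LocalGlobalMixing, n_modes, and so on) are illustrative assumptions, not DynFormer's published implementation; the sketch only shows one plausible way the described mechanisms could be wired up.

```python
# Hedged sketch of the scale-separated design described above.
# Names and shapes are hypothetical, not DynFormer's actual API.
import torch
import torch.nn as nn


class SpectralEmbedding(nn.Module):
    """Project a 2D field onto its lowest Fourier modes (the large-scale part)."""

    def __init__(self, n_modes: int):
        super().__init__()
        self.n_modes = n_modes  # low-frequency modes kept per spatial axis

    def forward(self, u):                  # u: (batch, H, W), real-valued field
        u_hat = torch.fft.rfft2(u)         # complex spectrum: (batch, H, W//2 + 1)
        m = self.n_modes
        low = u_hat[:, :m, :m]             # truncated low-frequency block
        return low, u_hat                  # full spectrum kept for the fine-scale residual
        # (For brevity the negative-frequency rows are dropped; in practice the
        # complex modes would also be split into real/imag feature channels
        # before being fed to the attention module below.)


class KroneckerAttention(nn.Module):
    """Axis-factorized attention over the low-mode grid.

    Attending separately along each axis means the effective attention map
    factorizes as a Kronecker product of per-axis maps, costing O(m^3) per
    axis instead of O(m^4) for dense attention over all m*m modes.
    """

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):                  # x: (batch, m, m, dim), real features
        b, m1, m2, d = x.shape
        # Attend along rows: fold the column index into the batch dimension.
        xr = x.permute(0, 2, 1, 3).reshape(b * m2, m1, d)
        xr, _ = self.row_attn(xr, xr, xr)
        x = xr.reshape(b, m2, m1, d).permute(0, 2, 1, 3)
        # Attend along columns.
        xc = x.reshape(b * m1, m2, d)
        xc, _ = self.col_attn(xc, xc, xc)
        return xc.reshape(b, m1, m2, d)


class LocalGlobalMixing(nn.Module):
    """Nonlinear multiplicative mixing of large-scale and fine-scale fields.

    A pointwise product in physical space convolves the two spectra, so
    multiplying the fine residual by a nonlinearly transformed large-scale
    field regenerates high-frequency content without global attention.
    """

    def __init__(self, channels: int):
        super().__init__()
        self.proj_large = nn.Conv2d(channels, channels, kernel_size=1)
        self.proj_fine = nn.Conv2d(channels, channels, kernel_size=1)
        self.out = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, fine, large):
        # fine, large: (batch, channels, H, W); `large` is the low-mode field
        # mapped back to the full grid (e.g. via an inverse FFT).
        mixed = torch.tanh(self.proj_large(large)) * self.proj_fine(fine)
        return self.out(mixed)
```

Two design points are worth noting, both read into the description rather than confirmed by it: axis-factorized attention is one natural realization of a "Kronecker-structured" attention map, and the pointwise product in LocalGlobalMixing is what performs the multiplicative frequency mixing, since multiplying two fields in physical space mixes their frequencies (sums and differences) much like a quadratic nonlinearity in a turbulent cascade.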