Stable Long-Horizon PDE Forecasting via Latent Structured Spectral Propagators

2026-05-11 · Machine Learning

AI summary

The authors address the challenge of predicting how physical systems described by partial differential equations (PDEs) evolve over long time horizons. They propose a neural network method that decomposes prediction into three steps: analyzing, simplifying, and forecasting the key dynamics in a frequency-based latent space. This design reduces the errors that usually accumulate during long forecasts and keeps predictions stable over time. In experiments, the method substantially outperforms existing approaches.

partial differential equations, neural operators, time-series forecasting, autoregressive models, spectral methods, latent space, modal evolution, inductive bias, frequency conditioning
Authors
Xiaoxiao Lu, Ye Yuan, Jiahao Shi
Abstract
Long-horizon forecasting of time-dependent partial differential equations (PDEs) is critical for characterizing the sustained evolution of physical systems. While neural operators have emerged as efficient surrogates, they typically learn implicit finite-time transitions from discrete observations. When deployed autoregressively, such propagators often suffer from rapid error accumulation and dynamic drift. To address this, we propose a neural forecasting framework that reformulates PDE rollout as learning a Structured Spectral Propagator (SSP) in a propagation-oriented latent space. Following an analysis-propagation-synthesis design, our framework: (i) maps physical states into a shared, time-consistent spatial representation; (ii) projects this space into a compact propagation state to isolate recurrent dynamics from fine-grained spatial details, thereby decoupling reconstruction fidelity from rollout regularity; and (iii) evolves retained spectral modes using a frequency-conditioned linear backbone complemented by a nonlinear spectral closure to account for truncated interactions. This explicit structuring endows the propagator with a strong inductive bias for coherent modal evolution. Extensive experiments demonstrate that SSP significantly outperforms state-of-the-art baselines, reducing relative $L_2$ errors by up to 48.9% and exhibiting improved stability in temporal extrapolation beyond the supervised horizon.
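The analysis-propagation-synthesis pipeline described in the abstract can be illustrated with a minimal numerical sketch. The snippet below is not the authors' implementation: the mode count, the exponential frequency multipliers standing in for the learned frequency-conditioned linear backbone, and the small quadratic term standing in for the nonlinear spectral closure are all illustrative assumptions. It only shows the structural idea: truncate to a compact spectral propagation state, evolve it autoregressively in latent space with per-frequency multipliers of magnitude at most one (so the linear part cannot amplify modes), then synthesize the field back.

```python
import numpy as np

N = 64   # spatial grid points
K = 8    # retained low-frequency modes (the compact propagation state)
T = 100  # autoregressive rollout steps

def analyze(u):
    # (i) analysis: project the physical state onto its K lowest
    # real-FFT modes, discarding fine-grained spatial detail
    return np.fft.rfft(u)[:K]

def synthesize(a):
    # (iii) synthesis: zero-pad the retained modes and invert the FFT
    full = np.zeros(N // 2 + 1, dtype=complex)
    full[:K] = a
    return np.fft.irfft(full, n=N)

# (ii) propagation: frequency-conditioned linear multipliers
# (hypothetical; each has magnitude <= 1, damping high frequencies)
freqs = np.arange(K)
linear = np.exp(-0.01 * freqs**2)

def propagate(a):
    # hypothetical nonlinear closure: a small bounded correction
    # standing in for interactions lost to spectral truncation
    closure = 0.001 * a * np.abs(a) / (1.0 + np.abs(a))
    return linear * a + closure

u0 = np.sin(2 * np.pi * np.arange(N) / N)  # initial condition
a = analyze(u0)
for _ in range(T):
    a = propagate(a)  # rollout entirely in the latent spectral space
u_T = synthesize(a)

print(u_T.shape)                  # (64,)
print(bool(np.all(np.isfinite(u_T))))  # True: no blow-up after 100 steps
```

Because the rollout happens on the truncated spectral state rather than on the full field, reconstruction fidelity (how well `synthesize` recovers spatial detail) is decoupled from rollout regularity (whether repeated application of `propagate` stays bounded), which is the separation the abstract emphasizes.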