One-Step Graph-Structured Neural Flows for Irregular Multivariate Time Series Classification
2026-05-11 • Machine Learning
Machine Learning • Artificial Intelligence
AI summary
The authors focus on improving Neural Flows, models that handle irregularly sampled time series by learning the solution path of an ODE directly instead of stepping through a numerical solver. They note that prior methods often treat variables independently, ignoring how they influence one another, and that the one-step mapping itself makes these interactions hard to learn. To fix this, they introduce Graph-Structured Neural Flows (GSNF), which add two self-supervised auxiliary objectives: one re-initializes trajectories from perturbed starting points so that their divergence reveals the interactions encoded in the graph, and the other exploits the flow's invertibility to check that predictions stay consistent when run forwards and backwards in time. Their approach achieves state-of-the-art classification accuracy on real-world data while keeping training efficient.
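To make the "continuous paths instead of step-by-step calculations" idea concrete, below is a minimal PyTorch sketch of a one-step neural flow: a network F(x0, t) that returns the state at time t in a single forward pass, with F(x0, 0) = x0 enforced by a time gate that vanishes at t = 0. This ResNet-style parameterization is a common construction in the neural-flows literature, not necessarily the architecture used in this paper; all class and variable names are illustrative.

```python
import torch
import torch.nn as nn

class OneStepFlow(nn.Module):
    """Minimal one-step neural flow sketch (illustrative, not the
    paper's exact model): x(t) = x0 + tanh(t) * g([x0, t])."""

    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.g = nn.Sequential(
            nn.Linear(dim + 1, hidden),
            nn.Tanh(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x0: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # tanh(t) vanishes at t = 0, so F(x0, 0) = x0 holds by
        # construction; no numerical ODE solver is invoked.
        return x0 + torch.tanh(t) * self.g(torch.cat([x0, t], dim=-1))
```

Because the state at any observation time is a single network evaluation, irregular timestamps cost no more than regular ones, which is the efficiency the summary and abstract refer to.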
Neural Flows • Ordinary Differential Equations (ODE) • Multivariate Time Series • Graph Neural Networks • Self-supervision • Trajectory Learning • Invertible Flows • Forward-backward Consistency • Interaction Modeling • Re-initialization
Authors
Mengzhou Gao, Kaiwei Wang, Pengfei Jiao
Abstract
Neural Flows efficiently model irregular multivariate time series by directly learning ODE solution trajectories with neural networks, bypassing step-by-step numerical solvers. Despite their efficiency, many existing approaches treat variables independently, leaving inter-variable interactions underexplored. Moreover, their one-step mapping makes interaction modeling inherently challenging, as it removes the iterative refinement of interactions during learning. To address this challenge, we propose one-step Graph-Structured Neural Flows (GSNF), which introduce two auxiliary-trajectory self-supervision strategies to strengthen interaction learning: (i) interaction-aware trajectory generation via re-initialization, which induces trajectory divergence to expose graph-induced interactions, with a theoretically derived lower bound on divergence; and (ii) reverse-time trajectory generation, which enforces forward-backward consistency to regularize graph learning, enabled by flow invertibility. Experiments on five real-world datasets show that GSNF achieves state-of-the-art classification performance with highly competitive training time and memory usage.
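As a rough reading of the two auxiliary-trajectory strategies, the sketch below writes them as training losses over a generic one-step flow. Assumptions to flag: `flow(x0, t)` and its inverse `inverse_flow(xt, t)` stand in for the paper's invertible flow; the Gaussian re-initialization, the hinge margin standing in for the theoretically derived divergence lower bound, and all names are hypothetical illustrations, not the authors' exact formulation.

```python
import torch

def reinit_divergence_loss(flow, x0, t, sigma=0.1, margin=1.0):
    # Interaction-aware trajectory generation via re-initialization
    # (sketch): perturb the initial state, roll both starts forward,
    # and encourage the two trajectories to diverge by at least
    # `margin`. The hinge stands in for the paper's derived lower
    # bound on divergence; sigma and margin are illustrative.
    x0_alt = x0 + sigma * torch.randn_like(x0)
    divergence = (flow(x0, t) - flow(x0_alt, t)).norm(dim=-1)
    return torch.relu(margin - divergence).mean()

def forward_backward_loss(flow, inverse_flow, x0, t):
    # Reverse-time trajectory generation (sketch): map forward to
    # time t, map back with the flow's inverse (well defined because
    # the flow is invertible), and penalize the reconstruction gap.
    xt = flow(x0, t)
    x0_rec = inverse_flow(xt, t)
    return (x0_rec - x0).pow(2).mean()
```

In training, terms like these would be added to the classification objective with small weights; the forward-backward term in particular acts as a regularizer on the learned graph, as the abstract describes.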