Neuro-Symbolic ODE Discovery with Latent Grammar Flow

2026-04-17 · Machine Learning

Machine Learning · Artificial Intelligence · Computational Engineering, Finance, and Science · Symbolic Computation
AI summary

The authors propose a method called Latent Grammar Flow (LGF) for discovering mathematical equations that describe how systems change over time, as in physics or biology. The method encodes equations in a grammar-based form and arranges them in a latent space so that similar equations lie close together, making the space easier to explore. LGF then uses this structure to guide the generation and evaluation of candidate equations that match the observed data. The authors also show that extra knowledge, such as stability rules, can be incorporated to improve the search.

Keywords
ordinary differential equations, latent space, grammar-based representation, neuro-symbolic methods, generative model, discrete flow model, behavioural loss, symbolic regression, stability constraints, domain knowledge
Authors
Karin Yu, Eleni Chatzi, Georgios Kissas
Abstract
Understanding natural and engineered systems often relies on symbolic formulations, such as differential equations, which provide interpretability and transferability beyond black-box models. We introduce Latent Grammar Flow (LGF), a neuro-symbolic generative framework for discovering ordinary differential equations from data. LGF embeds equations, represented through a grammar, into a discrete latent space and uses a behavioural loss to position semantically similar equations close together. A discrete flow model then guides the sampling process, recursively generating candidate equations that best fit the observed data. Domain knowledge and constraints, such as stability, can either be embedded into the grammar rules or used as conditional predictors.
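The abstract's behavioural loss can be illustrated with a minimal sketch: two candidate ODEs are compared by simulating their trajectories, and the distance between their latent codes is penalized for disagreeing with the distance between their behaviours. All names below (`euler_trajectory`, `behavioural_distance`, `behavioural_loss`) are illustrative assumptions, not the paper's implementation, which operates on grammar-based discrete latents.

```python
# Hypothetical sketch of a behavioural loss: candidate equations whose
# simulated trajectories are similar should receive nearby latent codes.
# None of these names come from the paper; this is a toy illustration.

def euler_trajectory(rhs, x0, dt=0.01, steps=200):
    """Integrate dx/dt = rhs(x) with forward Euler, starting from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + dt * rhs(xs[-1]))
    return xs

def behavioural_distance(rhs_a, rhs_b, x0=1.0):
    """Mean squared difference between the trajectories of two candidate ODEs."""
    ta = euler_trajectory(rhs_a, x0)
    tb = euler_trajectory(rhs_b, x0)
    return sum((a - b) ** 2 for a, b in zip(ta, tb)) / len(ta)

def behavioural_loss(z_a, z_b, rhs_a, rhs_b):
    """Penalize mismatch between latent distance and behavioural distance."""
    latent_dist = sum((u - v) ** 2 for u, v in zip(z_a, z_b)) ** 0.5
    return (latent_dist - behavioural_distance(rhs_a, rhs_b)) ** 2

# dx/dt = -x and dx/dt = -1.1*x behave similarly, so a small latent gap
# yields a small loss; dx/dt = -x and dx/dt = +x diverge, so a large
# latent gap is rewarded instead.
loss_similar = behavioural_loss([0.1, 0.0], [0.12, 0.01],
                                lambda x: -x, lambda x: -1.1 * x)
loss_different = behavioural_loss([0.1, 0.0], [0.9, 0.8],
                                  lambda x: -x, lambda x: x)
```

In the paper's setting the latent codes are discrete and produced by an encoder over grammar derivations; this sketch only conveys the training signal's shape, i.e. latent geometry being tied to dynamical behaviour rather than syntactic form.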