The Exponentially Weighted Signature
2026-03-19 • Machine Learning
AI summary
The authors discuss a way to better capture the history of complex paths by introducing the Exponentially Weighted Signature (EWS). Unlike previous methods that treat all past information the same, their method adjusts how past data is remembered using more flexible mathematical tools. This approach keeps useful properties of classic methods but allows for more complex memory behaviors like oscillations or growth. They also show their method connects to well-known models and transforms, can be computed efficiently, and performs better in some prediction tasks involving randomness.
Keywords
signature of a path, Exponentially Fading Memory (EFM), bounded linear operators, tensor algebra, controlled differential equations, state-space models, Laplace transform, Fourier transform, stochastic differential equations (SDEs), regression tasks
Authors
Alexandre Bloch, Samuel N. Cohen, Terry Lyons, Joël Mouterde, Benjamin Walker
Abstract
The signature is a canonical representation of a multidimensional path over an interval. However, it treats all historical information uniformly, offering no intrinsic mechanism for contextualising the relevance of the past. To address this, we introduce the Exponentially Weighted Signature (EWS), generalising the Exponentially Fading Memory (EFM) signature from diagonal to general bounded linear operators. These operators enable cross-channel coupling at the level of temporal weighting together with richer memory dynamics including oscillatory, growth, and regime-dependent behaviour, while preserving the algebraic strengths of the classical signature. We show that the EWS is the unique solution to a linear controlled differential equation on the tensor algebra, and that it generalises both state-space models and the Laplace and Fourier transforms of the path. The group-like structure of the EWS enables efficient computation and makes the framework amenable to gradient-based learning, with the full semigroup action parametrised by and learned through its generator. We use this framework to empirically demonstrate the expressivity gap between the EWS and both the signature and EFM on two SDE-based regression tasks.
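To make the idea of exponentially weighted iterated integrals concrete, here is a minimal illustrative sketch in the diagonal (EFM) special case the abstract mentions, for a path sampled at discrete times. It is an Euler discretisation of a truncated (depth-2) controlled differential equation in which each existing term decays at rate `lam` before the new path increment is appended; the paper's EWS replaces the scalar `lam` with a general bounded linear operator, which this sketch does not implement. The function name and the depth-2 truncation are choices made here for illustration, not the authors' API.

```python
import math

def efm_signature_depth2(path, lam):
    """Depth-2 exponentially-fading-memory signature (diagonal / EFM case).

    `path` is a list of (t, x) samples, with x a list of channel values.
    Illustrative Euler step for the controlled ODE dS = -lam*S dt + S (x) dX,
    truncated at tensor level 2.  With lam = 0 this approximates the
    classical (unweighted) truncated signature.
    """
    d = len(path[0][1])
    s1 = [0.0] * d                       # level-1 terms: weighted increments
    s2 = [[0.0] * d for _ in range(d)]   # level-2 terms: weighted double integrals
    for (t0, x0), (t1, x1) in zip(path, path[1:]):
        decay = math.exp(-lam * (t1 - t0))
        dx = [b - a for a, b in zip(x0, x1)]
        # update level 2 first, using the pre-update level-1 state,
        # then decay level 1 and append the new increment
        s2 = [[decay * s2[i][j] + decay * s1[i] * dx[j] for j in range(d)]
              for i in range(d)]
        s1 = [decay * v + dx[i] for i, v in enumerate(s1)]
    return s1, s2
```

For a 1-D linear path on [0, 1] with `lam = 0`, the level-1 term recovers the total increment (1.0) and the level-2 term approaches the true value 1/2 as the sampling is refined; a positive `lam` shrinks both, reflecting the fading memory of the distant past.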