Fix the Loss, Not the Radius: Rethinking the Adversarial Perturbation of Sharpness-Aware Minimization
2026-05-11 • Machine Learning
AI summary
The authors examine Sharpness-Aware Minimization (SAM), a technique that helps models generalize better by minimizing the worst-case loss in a neighborhood of the current parameters. They observe that SAM relies on a simple, first-order approximation, whereas the notion of flat minima is fundamentally about second-order information such as curvature. To address this mismatch, they propose Loss-Equated SAM (LE-SAM), which changes how the perturbation is constructed so that optimization is driven more by curvature and less by gradient magnitude alone. Their experiments show that LE-SAM outperforms SAM and its variants across many tasks.
Sharpness-Aware Minimization, generalization, loss landscape, gradient, curvature, flat minima, first-order methods, second-order methods, optimization, perturbation radius
Authors
Jinping Wang, Qinhan Liu, Zhiwu Xie, Zhiqiang Gao
Abstract
Sharpness-Aware Minimization (SAM) improves generalization by minimizing the worst-case loss within a neighborhood of fixed radius in parameter space. SAM and its variants rely mainly on a first-order linearized surrogate, while flat minima are inherently a second-order (curvature) notion. We revisit this mismatch and propose Loss-Equated SAM (LE-SAM), which inverts the traditional SAM mechanism by replacing the fixed perturbation radius with a fixed loss-space budget, effectively removing gradient-norm-dominated learning signals and shifting optimization toward curvature-dominated terms. Extensive experiments across diverse benchmarks and tasks demonstrate the strong generalization ability of LE-SAM, which consistently outperforms SAM and its variants, achieving state-of-the-art performance.
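To make the contrast concrete, below is a minimal sketch of the two ascent rules as I read the abstract: SAM scales the perturbation to a fixed parameter-space radius, while a loss-equated rule scales it so the first-order predicted loss increase equals a fixed budget, leaving curvature as the dominant remaining signal. The names `sam_perturbation`, `loss_budget_perturbation`, and the budget `delta` are illustrative assumptions, not the paper's actual implementation.

```python
import torch


def sam_perturbation(grad: torch.Tensor, rho: float) -> torch.Tensor:
    """SAM: ascend a fixed distance `rho` in parameter space along the gradient."""
    return rho * grad / (grad.norm() + 1e-12)


def loss_budget_perturbation(grad: torch.Tensor, delta: float) -> torch.Tensor:
    """Loss-equated ascent (sketch): scale the step so the first-order
    predicted loss increase g . eps equals a fixed budget `delta`.
    The first-order term then contributes a constant, so what remains of
    L(w + eps) - L(w) is dominated by curvature."""
    return delta * grad / (grad.norm() ** 2 + 1e-12)


# Toy usage on a quadratic loss L(w) = 0.5 * w^T A w.
w = torch.tensor([1.0, -2.0], requires_grad=True)
A = torch.diag(torch.tensor([1.0, 10.0]))
loss = 0.5 * w @ A @ w
loss.backward()

eps_sam = sam_perturbation(w.grad, rho=0.05)
eps_le = loss_budget_perturbation(w.grad, delta=0.05)
print("SAM step norm:", eps_sam.norm().item())
print("Predicted loss increase under loss-equated step:", (w.grad @ eps_le).item())  # ~= delta
```

Under this reading, the loss-equated step shrinks where gradients are large and grows where they are small, so the perturbed loss no longer rewards merely reducing the gradient norm; how the paper actually constructs the perturbation should be checked against the full text.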