A Closed-Form Adaptive-Landmark Kernel for Certified Point-Cloud and Graph Classification
2026-05-05 • Machine Learning
AI summary
The authors present PALACE, a method that improves classification by adaptively choosing landmarks in data using a small validation process. They provide theoretical guarantees about its performance and show how it optimizes weights and positions without extensive training. PALACE also offers reliable confidence measures for each prediction and performs very well compared to other methods on chemical graph datasets. Their experiments demonstrate that PALACE maintains strong accuracy even when the problem size increases significantly, unlike simpler uniform approaches.
topological data analysis, persistence diagrams, landmark selection, kernel methods, classification accuracy, cross-validation, cover theory, RKHS, margin bounds, adaptive sampling
Authors
Sushovan Majhi, Atish Mitra, Žiga Virk, Pramita Bagchi
Abstract
We introduce PALACE (Persistence Adaptive-Landmark Analytic Classification Engine), the data-adaptive companion to PLACE, which pays only a small cross-validation tier over three knobs (budget, radii, bandwidth; $\leq 5$ choices each). A cover-theoretic core (a Lebesgue-number criterion on the landmark cover) yields four closed-form guarantees. (i) A structural lower distortion bound $\lambda(\tau;\nu)$ on $\mathcal{D}_n$ under cross-diagram non-interference, with a $(D/L)^2$ budget reduction over the uniform grid when diagrams concentrate. (ii) Equal weights $w_k = K^{-1/2}$ maximizing $\lambda$, and farthest-point-sampling positions that $2$-approximate the optimal $k$-center covering radius; both are derived from training labels alone, with no gradient training. (iii) A kernel-RKHS classification rate $O((k-1)\sqrt{K}/(\gamma\sqrt{m_{\min}}))$ with a binary necessity threshold $m = \Omega(\sqrt{K}/\gamma)$ from a matching Le Cam lower bound, and a closed-form filtration-selection rule. The kernel-Mahalanobis margin $\hat\rho_{\mathrm{Mah}}$ is the strongest closed-form ranker across the chemical-graph pool (mean Spearman $\rho \approx +0.60$); the isotropic surrogate $\hat\gamma/\sqrt{K}$ admits a selection-consistency rate, and $\widehat\lambda$ from (i) provides an independent data-level signal (positive on COX2 and PTC). (iv) A per-prediction certificate, in non-asymptotic Pinelis and asymptotic Gaussian forms, requiring no calibration split. Empirically, PALACE is the strongest closed-form diagram-based method on Orbit5k ($91.3 \pm 1.0\%$, matching Persformer), leads every diagram-based competitor on COX2 and MUTAG, and is competitive on DHFR (within 1 pp of ECP). Under $8\times$ domain inflation, adaptive placement maintains $94\%$ accuracy while the uniform grid collapses to chance ($25\%$ on 4-class data).
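The landmark positions referenced in guarantee (ii) come from farthest-point sampling, whose greedy form is the classical $2$-approximation to the optimal $k$-center covering radius (Gonzalez, 1985). A minimal sketch of that greedy step, not the authors' implementation; function and variable names here are illustrative:

```python
import numpy as np

def farthest_point_sampling(points, k, seed=0):
    """Greedy farthest-point sampling over a point set.

    Returns k landmark indices and the resulting covering radius
    (max distance from any point to its nearest landmark); the
    radius is within a factor 2 of the optimal k-center radius.
    """
    rng = np.random.default_rng(seed)
    n = len(points)
    landmarks = [int(rng.integers(n))]  # arbitrary first landmark
    # Distance from every point to its nearest chosen landmark.
    d = np.linalg.norm(points - points[landmarks[0]], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(d))  # farthest point from current landmarks
        landmarks.append(nxt)
        d = np.minimum(d, np.linalg.norm(points - points[nxt], axis=1))
    return np.array(landmarks), float(d.max())
```

The greedy choice is what makes the approximation argument work: each new landmark is placed at the current worst-covered point, so the final covering radius can exceed the optimum by at most a factor of two.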