Optimal Codes for Deterministic Identification over Gaussian Channels: Closing the Capacity Gap

2026-04-13 · Information Theory

AI summary

The authors study deterministic identification (DI), a communication paradigm in which the receiver only needs to verify whether a particular message was sent over a noisy channel, rather than fully decode it. They resolve a long-standing open problem for Gaussian channels by constructing a code that matches the known upper bound, establishing that the linearithmic DI capacity is exactly 1/2. They also show that the proposed code achieves the best possible first-order tradeoff between message rate and error reliability. Finally, they prove the existence of a universal capacity-achieving code that requires no prior knowledge of the channel parameters.

deterministic identification · Gaussian channels · channel capacity · rate-reliability tradeoff · coding theory · error decay · universal codes · linearithmic capacity
Authors
Pau Colomer, Christian Deppe, Holger Boche, Andreas Winter
Abstract
Deterministic identification (DI) has emerged as a promising paradigm for large-scale and goal-oriented communication systems. Despite significant progress, a fundamental open problem has remained unresolved: a persistent gap between the best known lower and upper bounds on the DI capacity, as well as on the corresponding rate-reliability tradeoff bounds. In this paper, we finally close this gap for Gaussian channels $\mathcal{G}$ by constructing an optimised code that achieves the known upper bound. This allows us to establish that the linearithmic capacity for deterministic identification is $\dot{C}_{\text{DI}}(\mathcal{G})=\frac{1}{2}$. Furthermore, we analyse the rate-reliability tradeoff and show that the proposed scheme matches the known upper bounds to first order, thereby closing the existing gap in reliability performance for all admissible error decay regimes. Finally, we demonstrate the existence of an optimum universal code, which does not require knowledge of the channel parameters and yet achieves capacity.
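For readers unfamiliar with the linearithmic scale, the following is a hedged sketch of the capacity definition the abstract refers to, using the standard DI formulation (the precise notation, e.g. $N(n,\lambda_1,\lambda_2)$, is assumed from the DI literature and is not given on this page): DI code sizes over Gaussian channels grow superexponentially, as $2^{R\,n\log n}$ in the blocklength $n$, so the rate is normalised by $n\log n$ rather than $n$.

```latex
% Sketch (assumed notation): N(n,\lambda_1,\lambda_2) is the maximal
% number of messages of a deterministic identification code of
% blocklength n with type-I and type-II error probabilities at most
% \lambda_1 and \lambda_2. A rate R is achievable in the linearithmic
% scale if N(n,\lambda_1,\lambda_2) \geq 2^{R\, n \log n} for all
% sufficiently large n. The paper's main result is then
\[
  \dot{C}_{\mathrm{DI}}(\mathcal{G})
  \;=\; \lim_{\lambda_1,\lambda_2 \to 0}\,
        \liminf_{n\to\infty}
        \frac{\log N(n,\lambda_1,\lambda_2)}{n \log n}
  \;=\; \frac{1}{2}.
\]
```

The previously known bounds left a gap in this quantity; the abstract states that the constructed code closes it by achieving the upper bound of $\frac{1}{2}$.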