Recognition: no theorem link
An Improved Lower Bound on Support Size of Capacity-Achieving Inputs for the Binomial Channel: Extended version
Pith reviewed 2026-05-13 03:07 UTC · model grok-4.3
The pith
The capacity-achieving input distribution for the binomial channel must have support size at least of order sqrt(n log log n), where n is the number of trials.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
We derive a new lower bound on the support size of order sqrt(n log log n), up to explicit constants.
Load-bearing premise
The Beta-binomial output distribution induced by the reference input X_r ~ Beta(1/2,1/2) is asymptotically optimal in chi^2 divergence after the comparison step.
read the original abstract
We study the binomial channel and the structure of its capacity-achieving input and output distributions. It is known that the capacity-achieving input distribution is discrete and supported on finitely many points. The best previously known bounds show that the support size of the capacity-achieving distribution is lower-bounded by a term of order $\sqrt n$ and upper-bounded by a term of order $n/2$, where $n$ is the number of trials. In this work, we derive a new lower bound on the support size of order $\sqrt{n\log\log n}$, up to explicit constants. The proof consists of three main steps. First, we derive new upper and lower bounds on the capacity with a gap that vanishes as $n\to\infty$, which yields $C(n)=\frac12\log\frac{n\pi}{2e}+o(1)$. Second, we show that the Beta-binomial output distribution induced by the reference input $X_r\sim\mathrm{Beta}(1/2,1/2)$ is asymptotically optimal: it approaches the capacity-achieving output distribution in relative entropy and, after a comparison step, in $\chi^2$ divergence. Third, we prove a quantitative $\chi^2$ approximation lower bound showing that this Beta-binomial output cannot be approximated too well by the output induced by a $K$-point input. Combining these ingredients forces the capacity-achieving input distribution to have at least order $\sqrt{n\log\log n}$ mass points.
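The second ingredient of the proof can be sketched numerically: the Beta-binomial output law induced by the Jeffreys-prior input X_r ~ Beta(1/2, 1/2), alongside the abstract's first-order capacity approximation. This is a minimal sketch, not code from the paper; function names are illustrative, and logarithms are natural, so the capacity value is in nats:

```python
import math

def beta_binomial_pmf(n, k, a=0.5, b=0.5):
    # Output law when X ~ Beta(a, b) is passed through Binomial(n, X):
    #   P(Y = k) = C(n, k) * B(k + a, n - k + b) / B(a, b)
    # computed in log-space via lgamma for numerical stability.
    log_p = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
             + math.lgamma(k + a) + math.lgamma(n - k + b) - math.lgamma(n + a + b)
             + math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b))
    return math.exp(log_p)

n = 100
pmf = [beta_binomial_pmf(n, k) for k in range(n + 1)]
total = sum(pmf)  # a valid pmf: sums to 1 up to floating-point error

# First-order capacity approximation from the abstract, in nats:
#   C(n) = (1/2) * log(n * pi / (2e)) + o(1)
cap_approx = 0.5 * math.log(n * math.pi / (2 * math.e))
```

For a = b = 1/2 the output pmf is symmetric, pmf[k] == pmf[n - k], reflecting the symmetry of the Jeffreys prior about 1/2.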
Editorial analysis
A structured set of objections, weighed in public.
Axiom & Free-Parameter Ledger
axioms (1)
- domain assumption: The binomial channel consists of n independent Bernoulli trials whose success probability equals the input value x in [0,1].
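The channel law in this assumption, together with the output mixture induced by a finite-support input (the object bounded in the paper's third step), can be sketched as follows. The two-point input here is a hypothetical illustration, not the paper's capacity-achieving distribution:

```python
import math

def binomial_channel_pmf(n, x, k):
    # Channel law: P(Y = k | X = x) = C(n, k) * x^k * (1 - x)^(n - k)
    return math.comb(n, k) * x**k * (1 - x)**(n - k)

def output_pmf(n, support, weights):
    # Output distribution induced by a K-point input: a finite mixture
    # of Binomial(n, x) laws over the input's mass points.
    return [sum(w * binomial_channel_pmf(n, x, k) for x, w in zip(support, weights))
            for k in range(n + 1)]

n = 10
p_y = output_pmf(n, support=[0.25, 0.75], weights=[0.5, 0.5])  # K = 2 mass points
```

A K-point input always induces such a finite mixture of binomials at the output; the paper's chi^2 approximation bound quantifies how far any such mixture must stay from the Beta-binomial reference output.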
Reference graph
Works this paper leans on
- [1] N. Farsad, W. Chuang, A. Goldsmith, C. Komninakis, M. Médard, C. Rose, L. Vandenberghe, E. E. Wesel, and R. D. Wesel, “Capacities and optimal input distributions for particle-intensity channels,” IEEE Transactions on Molecular, Biological and Multi-Scale Communications, vol. 6, no. 3, pp. 220–232, 2020.
- [2] A. Einolghozati, M. Sardari, and F. Fekri, “Design and analysis of wireless communication systems using diffusion-based molecular communication among bacteria,” IEEE Transactions on Wireless Communications, vol. 12, no. 12, pp. 6096–6105, 2013.
- [3] N. Farsad, C. Rose, M. Médard, and A. Goldsmith, “Capacity of molecular channels with imperfect particle-intensity modulation and detection,” in 2017 IEEE International Symposium on Information Theory (ISIT). IEEE, 2017, pp. 2468–2472.
- [4] V. Jamali, A. Ahmadzadeh, W. Wicke, A. Noel, and R. Schober, “Channel modeling for diffusive molecular communication—A tutorial review,” Proceedings of the IEEE, vol. 107, no. 7, pp. 1256–1301, 2019.
- [5] V. I. Levenshtein, “Binary codes capable of correcting deletions, insertions, and reversals,” in Soviet Physics Doklady, vol. 10, no. 8. Soviet Union, 1966, pp. 707–710.
- [6] M. Cheraghchi, “Capacity upper bounds for deletion-type channels,” Journal of the ACM (JACM), vol. 66, no. 2, pp. 1–79, 2019.
- [7] C. Komninakis, L. Vandenberghe, and R. D. Wesel, “Capacity of the binomial channel, or minimax redundancy for memoryless sources,” in IEEE International Symposium on Information Theory, 2001, pp. 127–127.
- [8] Q. Xie and A. R. Barron, “Minimax redundancy for the class of memoryless sources,” IEEE Trans. Inf. Theory, vol. 43, no. 2, pp. 646–657, 1997.
- [9] H. Witsenhausen, “Some aspects of convexity useful in information theory,” IEEE Trans. Inf. Theory, vol. 26, no. 3, pp. 265–271, 1980.
- [10] I. Zieder, A. Favano, L. Barletta, and A. Dytso, “Binomial channel: On the capacity-achieving distribution and bounds on the capacity,” in Proceedings of the IEEE International Symposium on Information Theory (ISIT), 2024, pp. 711–716.
- [11] H. Wang, L. Barletta, and A. Dytso, “An improved lower bound on cardinality of support of the amplitude-constrained AWGN channel.”
- [12] [Online]. Available: https://arxiv.org/abs/2512.22691
- [13] Y. Ma, Y. Wu, and P. Yang, “On the best approximation by finite Gaussian mixtures,” IEEE Trans. Inf. Theory, vol. 71, no. 7, pp. 5469–5492, 2025.
- [14] A. Dytso, M. Goldenbaum, H. V. Poor, and S. S. Shitz, “When are discrete channel inputs optimal? — optimization techniques and some new results,” in 2018 52nd Annual Conference on Information Sciences and Systems (CISS), 2018, pp. 1–6.
- [15] N. L. Johnson, A. W. Kemp, and S. Kotz, Univariate Discrete Distributions, 3rd ed. Wiley, 2005.
- [16] I. Sason and S. Verdú, “f-divergence inequalities,” IEEE Trans. Inf. Theory, vol. 62, no. 11, pp. 5973–6006, 2016.
- [17] I. Zieder, A. Favano, L. Barletta, and A. Dytso, “Binomial channel: On the capacity-achieving distribution and bounds on the capacity,” arXiv preprint arXiv:2401.12818, 2024.
- [18] G. H. Golub, A. Hoffman, and G. W. Stewart, “A generalization of the Eckart–Young–Mirsky approximation theorem,” Linear Algebra and its Applications, vol. 88/89, pp. 317–328, 1988.
- [19] I. S. Gradshteyn and I. M. Ryzhik, Table of Integrals, Series, and Products, 7th ed., A. Jeffrey and D. Zwillinger, Eds. Amsterdam, The Netherlands: Academic Press, 2007.
- [20] D. Kershaw, “Some extensions of W. Gautschi’s inequalities for the gamma function,” Mathematics of Computation, vol. 41, no. 164, pp. 607–611, 1983.
- [21] F. Topsøe, “An information theoretical identity and a problem involving capacity,” Studia Scientiarum Mathematicarum Hungarica, vol. 2, no. 291-292, p. 246, 1967.
- [22] I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems. Cambridge University Press, 2011.
discussion (0)