On the Limits of Learning a Discrete Memoryless Communication Channel

Date

2023-01-24

Citation of Original Publication

M. A. Tope and J. M. Morris, "On the Limits of Learning a Discrete Memoryless Communication Channel," MILCOM 2022 - 2022 IEEE Military Communications Conference (MILCOM), Rockville, MD, USA, 2022, pp. 222-228, doi: 10.1109/MILCOM55135.2022.10017597.

Rights

© 2023 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

Abstract

This paper explores bounds on the minimum number of channel probes required to learn sufficient information to establish reliable communication at a given rate (below channel capacity) over a discrete memoryless channel. Given a set of discrete channel input-output sample pairs (where, for each discrete input value, the aggregate set of associated discrete output observations is multinomially distributed), we leverage a non-asymptotic probably approximately correct (PAC) bound on the mutual information (channel capacity) between the label (discrete) random variable (RV) and the observation RV to establish a convergence rate for the worst-case channel. Previous bounds (such as those based on Sanov's theorem) provide high-probability (i.e., PAC) bounds on the true mutual information that converge at rate O(log(N)/√N), where N is the number of independent and identically distributed (i.i.d.) samples used to compute the empirical probability mass functions. Using an improved PAC sublevel-set bound, we sharpen the rate of convergence to O(√(log(log(N)) · log(N)/N)).
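The abstract refers to estimating mutual information from empirical probability mass functions built from N i.i.d. input-output samples. The sketch below illustrates that setup (it is not the paper's bound itself): a plug-in estimate of I(X;Y) for a binary symmetric channel, with the channel, crossover probability, and sample size chosen here purely for illustration.

```python
import numpy as np

def empirical_mutual_information(x, y, nx, ny):
    """Plug-in estimate of I(X;Y) in nats from paired discrete samples.

    Builds the empirical joint PMF from counts, then sums
    p(x,y) * log(p(x,y) / (p(x) p(y))) over nonzero cells.
    """
    joint = np.zeros((nx, ny))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # empirical marginal of X
    py = joint.sum(axis=0, keepdims=True)   # empirical marginal of Y
    mask = joint > 0                        # avoid log(0) on empty cells
    return float((joint[mask] * np.log(joint[mask] / (px @ py)[mask])).sum())

# Illustration: binary symmetric channel with crossover probability 0.1
rng = np.random.default_rng(0)
N = 10_000
x = rng.integers(0, 2, size=N)          # uniform channel inputs
flips = rng.random(N) < 0.1             # i.i.d. bit flips
y = x ^ flips                           # channel outputs
I_hat = empirical_mutual_information(x, y, 2, 2)
# True value for BSC(0.1) with uniform input: log(2) - H_b(0.1) ≈ 0.368 nats
```

The plug-in estimate converges to the true mutual information as N grows; the paper's contribution concerns how fast a high-probability (PAC) bound on this quantity can be made to converge for the worst-case channel.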