# Interactive Channel Capacity

**Computer Science/Discrete Mathematics Seminar II**

- Topic: Interactive Channel Capacity
- Speaker: Gillat Kol
- Affiliation: Member, School of Mathematics
- Date: Tuesday, November 19
- Time/Room: 10:30am - 12:30pm / S-101
- Video Link: https://video.ias.edu/csdm/2013/1119-GillatKol

In a profoundly influential 1948 paper, Claude Shannon defined the entropy function $H$ and showed that the capacity of a binary symmetric channel with noise rate (bit-flip rate) $\varepsilon$ is $1-H(\varepsilon)$. This means that one can reliably communicate $n$ bits by sending roughly $n / (1-H(\varepsilon))$ bits over this channel. The extensive study of interactive communication protocols over the last decades gives rise to the related question of finding the capacity of a noisy channel when it is used interactively. We define interactive channel capacity as the minimal ratio between the communication required to compute a function over a noiseless channel and the communication required to compute the same function over the $\varepsilon$-noisy channel. We show that the interactive channel capacity is roughly $1-\Theta\left( \sqrt{H(\varepsilon)} \right)$. Our result gives the first separation between interactive and non-interactive channel capacity. Joint work with Ran Raz.
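As a small numerical illustration of Shannon's formula quoted above (a sketch, not part of the talk): the binary entropy $H(\varepsilon)$ determines the capacity $1-H(\varepsilon)$ of the binary symmetric channel, and hence how many transmitted bits are needed per reliably communicated bit. The function and variable names below are illustrative choices, not from the abstract.

```python
import math

def binary_entropy(eps: float) -> float:
    """Shannon's binary entropy H(eps), in bits."""
    if eps <= 0.0 or eps >= 1.0:
        return 0.0
    return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

def bsc_capacity(eps: float) -> float:
    """Capacity 1 - H(eps) of the binary symmetric channel."""
    return 1.0 - binary_entropy(eps)

# To reliably communicate n bits over the eps-noisy channel,
# roughly n / (1 - H(eps)) transmitted bits suffice.
eps = 0.01
n = 1000
print(f"H({eps})      = {binary_entropy(eps):.4f}")
print(f"capacity     = {bsc_capacity(eps):.4f}")
print(f"bits for n={n}: {n / bsc_capacity(eps):.0f}")
```

For small $\varepsilon$ the non-interactive overhead $H(\varepsilon)$ is tiny, which is what makes the interactive lower bound of order $\sqrt{H(\varepsilon)}$ a genuine separation.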