DIMACS Theoretical Computer Science Seminar


Title: Interactive Channel Capacity

Speaker: Gillat Kol, IAS

Date: Wednesday, November 20, 2013, 11:00am-12:00pm

Location: DIMACS Center, CoRE Bldg, Room 431, Rutgers University, Busch Campus, Piscataway, NJ


Abstract:

In a profoundly influential 1948 paper, Claude Shannon defined the entropy function H and showed that the capacity of the binary symmetric channel with noise rate (bit flip rate) eps is 1-H(eps). This means that one can reliably communicate n bits by sending roughly n/(1-H(eps)) bits over this channel.
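As a quick numerical illustration of Shannon's formula (not part of the talk), the following short Python sketch computes the binary entropy H(eps) and the resulting number of channel uses; the function names binary_entropy and channel_uses are just illustrative:

    import math

    def binary_entropy(eps):
        # H(eps) = -eps*log2(eps) - (1-eps)*log2(1-eps), with H(0) = H(1) = 0.
        if eps in (0.0, 1.0):
            return 0.0
        return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

    def channel_uses(n_bits, eps):
        # Capacity of the binary symmetric channel is 1 - H(eps), so sending
        # n bits reliably takes roughly n / (1 - H(eps)) channel uses.
        return n_bits / (1 - binary_entropy(eps))

    print(binary_entropy(0.1))      # ~0.469
    print(channel_uses(1000, 0.1))  # ~1883 channel uses for 1000 data bits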

The extensive study of interactive communication protocols over the last few decades raises the related question of determining the capacity of a noisy channel when it is used interactively. We define the interactive channel capacity as the minimal ratio between the communication required to compute a function over a noiseless channel and the communication required to compute the same function over the eps-noisy channel. We show that the interactive channel capacity is roughly 1-Theta(sqrt(H(eps))). Our result gives the first separation between interactive and non-interactive channel capacity.
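The constants hidden by Theta(.) are not specified in the abstract, but the separation already shows up in the asymptotics: as eps goes to 0, sqrt(H(eps)) dominates H(eps), so 1-Theta(sqrt(H(eps))) falls below the one-way capacity 1-H(eps). A minimal Python sketch of this gap (illustrative only, ignoring the hidden constants):

    import math

    def H(eps):
        # Binary entropy function, in bits.
        return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

    # Compare the one-way capacity 1-H(eps) with 1-sqrt(H(eps)); the latter
    # stands in for 1-Theta(sqrt(H(eps))) with the constant set to 1.
    for eps in (1e-1, 1e-2, 1e-3, 1e-4):
        print(f"eps={eps:g}  1-H(eps)={1 - H(eps):.4f}  "
              f"1-sqrt(H(eps))={1 - math.sqrt(H(eps)):.4f}")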

Joint work with Ran Raz.

See: http://www.math.rutgers.edu/~sk1233/theory-seminar/F13/