Title: On the Communication Complexity of Distributions
Speaker: Mario Szegedy, Rutgers University
Date: Wednesday, February 20, 2008 11:00-12:00pm
Location: DIMACS Center, CoRE Bldg, Room 431, Rutgers University, Busch Campus, Piscataway, NJ
Consider the following general communication problem: Alice and Bob have to simulate a probabilistic function F that associates a probability distribution with every input pair (x, y). The two parties, upon receiving inputs x and y respectively, need to output a and b such that the pair (a, b) is distributed according to F(x, y) (recall that for every x and y, F(x, y) is a distribution on pairs (a, b)). Alice and Bob share randomness (this is their only source of randomness) and have access to a channel that allows two-way communication. Our main focus is an instance of the above problem coming from the well-known EPR experiment in quantum physics, but we also present some more general facts about this rather young and promising complexity measure. The results contained herein are entirely classical, and no knowledge of quantum phenomena is assumed.
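The model above can be illustrated with a minimal sketch. The target distribution F below is a hypothetical toy example (not from the talk): a and b are uniform bits constrained so that a XOR b = x AND y, which shared randomness alone cannot produce but one communicated bit can.

```python
import random

def simulate(x, y, rng):
    """One round of the model: shared randomness plus one bit of communication.

    Toy target F(x, y): (a, b) uniform over bit pairs with a XOR b = x AND y.
    """
    # Shared randomness: both parties see the same random bit r.
    r = rng.randint(0, 1)
    # Communication: Alice sends her input bit (one bit over the channel).
    message = x
    a = r                   # Alice's output, uniform thanks to r
    b = r ^ (message & y)   # Bob's output, so that a XOR b = x AND y
    return a, b
```

Because both outputs are masked by the same shared bit r, each party's output is individually uniform, while the pair satisfies the required correlation exactly.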
Different notions of complexity may be defined for this problem. Due to an upper bound by Toner and Bacon and a matching lower bound by Barrett, Kent and Pironio, the average-case and worst-case communication complexities are both known to be exactly one bit. In this paper, we focus on the asymptotic communication complexity, for which only an upper bound of 0.85 bits, due to Toner and Bacon, was previously known. We show that the technique they use to compress the communication is limited by yet another notion of complexity, which we call the entropic communication, but that it is nevertheless possible to reduce the communication further, to 0.28 bits. This comes as something of a surprise, since the earlier bound had been conjectured to be optimal. Along the way, we find interesting connections to a number of other problems in communication complexity.
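For context, the one-bit Toner–Bacon protocol for simulating singlet (EPR) correlations can be sketched as follows. The vector formulas below follow the commonly cited statement of their protocol and should be checked against the original paper; all function names here are my own.

```python
import random, math

def random_unit_vector(rng):
    # Uniform point on the unit sphere via normalized Gaussians.
    v = [rng.gauss(0, 1) for _ in range(3)]
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def sgn(t):
    return 1 if t >= 0 else -1

def toner_bacon(a_vec, b_vec, rng):
    """One run: inputs are unit measurement vectors; outputs are +/-1."""
    # Shared randomness: two independent uniform unit vectors.
    l1 = random_unit_vector(rng)
    l2 = random_unit_vector(rng)
    alpha = -sgn(dot(a_vec, l1))                    # Alice's output
    c = sgn(dot(a_vec, l1)) * sgn(dot(a_vec, l2))   # the single communicated bit
    beta = sgn(dot(b_vec, [p + c * q for p, q in zip(l1, l2)]))  # Bob's output
    return alpha, beta
```

Over the shared randomness, E[alpha * beta] reproduces the singlet correlation -a.b; in particular, equal measurement directions give perfectly anticorrelated outputs on every run, not just in expectation.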
Joint work with Jeremie Roland (University of California, Berkeley).