DIMACS Theoretical Computer Science Seminar


Title: Exponential Separation of Information and Communication

Speaker: Gillat Kol, IAS

Date: Wednesday, April 15, 2015 11:00am-12:00pm

Location: CoRE Bldg, Room 301, Rutgers University, Busch Campus, Piscataway, NJ


Abstract:

In profoundly influential works, Shannon and Huffman showed that if Alice wants to send a message X to Bob, it suffices for her to send roughly H(X) bits (in expectation), where H denotes Shannon's entropy function. In other words, the message X can be compressed to roughly H(X) bits, the information content of the message. Can one prove similar results in the interactive setting, where Alice and Bob engage in an interactive communication protocol?
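As a reminder of the one-way setting, the quantities involved can be written in standard notation (a sketch, not part of the results below):

    H(X) = - \sum_x \Pr[X = x] \log_2 \Pr[X = x]
    H(X) \le \mathbb{E}[\,\text{length of an optimal prefix-free encoding of } X\,] < H(X) + 1

That is, an optimal prefix-free code (such as Huffman's) compresses X to within one bit of its entropy, in expectation.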

We show the first gap between communication complexity and information complexity by giving an explicit example of a Boolean function with information complexity O(k) and distributional communication complexity > 2^k. This shows that a communication protocol cannot always be compressed to its internal information, answering (the standard formulation of) the above question in the negative. By a result of Braverman, our example gives the largest possible gap.

By a result of Braverman and Rao, our example gives the first gap between communication complexity and amortized communication complexity, implying that a strong direct sum theorem does not hold for distributional communication complexity and answering a long-standing open problem.
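Concretely, Braverman and Rao showed that internal information complexity equals amortized communication complexity; written in standard (assumed) notation and suppressing error parameters, combining this with the gap above gives, roughly,

    \lim_{n \to \infty} \frac{D_\mu(f^n)}{n} \;=\; \mathrm{IC}_\mu(f) \;=\; O(k), \qquad \text{while} \qquad D_\mu(f) \;>\; 2^k,

so solving n independent copies of f can be far cheaper than n times the cost of a single copy, which is exactly the failure of strong direct sum.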

Joint work with Anat Ganor and Ran Raz.

See: http://www.math.rutgers.edu/~sk1233/theory-seminar/S15/