Lower Bounds for Randomized Communication Complexity

Topic: Lower Bounds for Randomized Communication Complexity
Speaker: Mike Saks
Affiliation: Rutgers, The State University
Date: Monday, May 4
Time/Room: 11:15am - 12:15pm/S-101

We prove lower bounds on the randomized two-party communication complexity of functions that arise from read-once boolean formulae. A read-once boolean formula F is a propositional formula in which each variable appears exactly once. Such a formula can be represented by a tree, where the leaves correspond to variables and the internal nodes are labeled by binary connectives.

Some important lower bounds in the communication complexity model, such as the DISJOINTNESS lower bound of Kalyanasundaram and Schnitger, apply only to specific read-once formulae, such as DISJOINTNESS and TRIBES. In this paper we use information theory methods to prove lower bounds that hold for any read-once formula. Our lower bounds are of the form $n(F)/c^{d(F)}$, where $n(F)$ is the number of variables and $d(F)$ is the depth of the formula, and they are optimal up to the constant $c$ in the denominator.

This result should also be compared with the result of Heiman, Newman and Wigderson that every read-once threshold function $f$ has randomized decision tree complexity at least $n(f)/2^{d(f)}$. A lower bound on the communication complexity of $f^{\vee}$ or $f^{\wedge}$ gives the same lower bound on the decision tree complexity of $f$. The implication goes only one way, however, since communication protocols for $f^{\vee}$ and $f^{\wedge}$ do not have to come from a decision tree algorithm for $f$ and can be much faster. (For example, $f^{\wedge}$ when $f=\textsc{AND}_n$ has randomized decision tree complexity $\Theta(n)$ but communication complexity $O(1)$.) Thus, our result can be viewed as extending the decision tree lower bound (though with a worse constant in the base of the exponent).

This is joint work with Nikos Leonardos.
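The objects above can be made concrete with a small sketch. The code below (illustrative only; the class and function names are not from the paper) represents a read-once formula as a tree with variables at the leaves and binary connectives at the internal nodes, computes its depth $d(F)$, and shows the trivial constant-communication protocol for $f^{\wedge}$ when $f=\textsc{AND}_n$: Alice sends the single bit $\bigwedge_i x_i$, which is enough for Bob to output the answer, even though any randomized decision tree for $\textsc{AND}_n$ must read $\Theta(n)$ variables.

```python
from dataclasses import dataclass
from typing import Union

@dataclass
class Leaf:
    index: int           # which input variable this leaf reads

@dataclass
class Gate:
    op: str              # binary connective: "AND" or "OR"
    left: "Node"
    right: "Node"

Node = Union[Leaf, Gate]

def evaluate(node: Node, x: list) -> bool:
    """Evaluate a read-once formula on input x; each variable is read exactly once."""
    if isinstance(node, Leaf):
        return bool(x[node.index])
    a, b = evaluate(node.left, x), evaluate(node.right, x)
    return (a and b) if node.op == "AND" else (a or b)

def depth(node: Node) -> int:
    """The quantity d(F) in the lower bound n(F)/c^{d(F)}."""
    if isinstance(node, Leaf):
        return 0
    return 1 + max(depth(node.left), depth(node.right))

# For f = AND_n, the two-party function f^AND is the AND of all x_i AND y_i.
# Alice communicates one bit (the AND of her input); Bob finishes locally.
def and_n_protocol(x: list, y: list) -> bool:
    alice_bit = all(x)            # the single bit Alice sends
    return alice_bit and all(y)   # Bob's local computation
```

For example, the formula $F = (x_0 \wedge x_1) \vee x_2$ is `Gate("OR", Gate("AND", Leaf(0), Leaf(1)), Leaf(2))`, with `depth(F) == 2` and three leaves, one per variable.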