# Variance/Entropy Decomposition Techniques for Proving Fast Mixing of the Glauber Dynamics

- **Seminar:** Computer Science/Discrete Math Seminar, II
- **Topic:** Variance/Entropy Decomposition Techniques for Proving Fast Mixing of the Glauber Dynamics
- **Speaker:** Dror Weitz
- **Affiliation:** IAS
- **Date:** Tuesday, December 14
- **Time/Room:** 10:30am - 12:30pm / S-101

The Glauber dynamics is a simple Markov chain algorithm for sampling from distributions that arise in models from statistical physics. In each step of this dynamics, the value (or spin) of a random site is updated according to some rule that is "consistent" with the equilibrium distribution we wish to sample from. While this dynamics converges to the equilibrium distribution by construction, an important task is to determine the mixing time of the dynamics, i.e., the number of steps needed to get close to the stationary distribution starting from an arbitrary state.

It is widely believed that the mixing time of the Glauber dynamics is closely related to how close the stationary distribution is to being a product distribution, and this is formally known in some restricted scenarios. In particular, a well-known result in this field is that for models on the integer lattice Z^d, the mixing time is O(n log n) if (and only if) the stationary distribution exhibits a strong form of decay of correlations. While Stroock and Zegarlinski were the first to prove this result in '92, a number of proofs have followed since, and probably the most elegant was given by Cesi a few years ago. The latter proof is based on the fact that the spectral gap of the Glauber dynamics expresses how well the variance of a function w.r.t. the stationary distribution can be decomposed into the sum of the local conditional variances at each site. (In fact, Cesi considers the log-Sobolev constant of the dynamics, where the decomposition is of entropy rather than variance.) As Cesi shows, this decomposition can be carried out while losing only a constant factor if the stationary distribution is close to being product (as is the case when the strong decay of correlations holds).
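To make the update rule concrete, here is a minimal sketch of heat-bath Glauber dynamics for the Ising model on an n x n torus. This is an illustrative example only, not the speaker's code; the model, parameter names, and heat-bath update choice are my assumptions for the sake of the sketch.

```python
import math
import random

def glauber_step(spins, beta, n):
    """One heat-bath Glauber update: pick a uniformly random site and
    resample its spin from the conditional equilibrium distribution
    given the current spins of its neighbors."""
    i, j = random.randrange(n), random.randrange(n)
    # Sum of the four neighboring spins (periodic boundary conditions).
    s = (spins[(i - 1) % n][j] + spins[(i + 1) % n][j]
         + spins[i][(j - 1) % n] + spins[i][(j + 1) % n])
    # Conditional probability of spin +1 given the neighbors (Ising model).
    p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * s))
    spins[i][j] = 1 if random.random() < p_plus else -1

def run(n=8, beta=0.3, steps=5000, seed=0):
    """Run the chain from a uniformly random starting configuration."""
    random.seed(seed)
    spins = [[random.choice([-1, 1]) for _ in range(n)] for _ in range(n)]
    for _ in range(steps):
        glauber_step(spins, beta, n)
    return spins
```

Because each step resamples exactly from the conditional distribution at one site, the chain is reversible with respect to the Ising measure, which is the "consistency by construction" the abstract refers to.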
In a paper from last year with Alistair Sinclair and Fabio Martinelli, we used such decomposition ideas to show that a weaker notion of decay of correlations suffices for rapid mixing for models on trees, and this allowed us to obtain new rapid mixing results in many interesting cases, including the first results that depend on the boundary condition.

In this talk, after introducing and defining the models we work with and the Glauber dynamics, I will explain how the spectral gap is related to the above decomposition of variance, and how decompositions of this kind are related to the stationary distribution being close to product. I will then illustrate the usefulness of this characterization of the spectral gap by using it to prove a few simple results (e.g., that the mixing time for models on trees is always polynomial in the number of sites). We should then be ready to go through Cesi's proof for models on Z^d. Finally (I expect the last part to require another session), I will go over our proof that, for systems on trees, decay of correlations implies an O(n log n) mixing time. I would like to point out that some of the material I will discuss in the talk can be of general interest beyond the context of the Glauber dynamics; in particular, a main theme will be discussing and comparing different notions that measure how close a distribution is to being product, and how this can be used to decompose variance/entropy w.r.t. this distribution.
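The decomposition that relates the spectral gap to local conditional variances can be sketched as follows (the notation here is mine, stated for the standard continuous-time heat-bath dynamics in which each site updates at rate 1, not necessarily the exact formulation used in the talk). Write $\operatorname{Var}_x(f)$ for the conditional variance of $f$ at site $x$ given the spins at all other sites. A Poincaré-type inequality of the form

$$
\operatorname{Var}_\mu(f) \;\le\; C \sum_{x \in V} \mathbb{E}_\mu\!\big[\operatorname{Var}_x(f)\big]
\qquad \text{for all functions } f,
$$

with a constant $C$, says exactly that the global variance decomposes into the sum of local conditional variances up to a factor $C$; since the right-hand sum is the Dirichlet form of the heat-bath dynamics, it implies a spectral gap of at least $1/C$. When $\mu$ is a product measure, the inequality holds with $C = 1$ (tensorization of variance), which is one precise sense in which "close to product" yields fast mixing.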