Markov Chain Monte Carlo (Q3, spring 2011) – Course Webpage

1. The 7th and last lecture will take place in INCUBA CENTER room **121** on Friday 18/3/11, 12:15-15:00. Note the unusual day **and the unusual room**!

2. The project page is now up, including a list of “recommendations” to choose from, and some more details.

Markov Chain Monte Carlo is a family of algorithms for approximately counting combinatorial objects (colorings of graphs, contingency tables, etc.), approximately measuring continuous objects (e.g. estimating the volume of convex bodies), and similar tasks. They are based on approximately-uniform sampling algorithms, which in turn are based on designing a suitable Markov chain and analyzing its convergence. The course will cover methods for proving rapid convergence of Markov chains (coupling, conductance, canonical paths), the relevance of specific classes of Markov chains, and how to use approximate sampling to obtain approximate counting. The course will to a large extent be based on Eric Vigoda's course on the subject.

This course in the Aarhus course catalog: here


Elad Verbin. (With help from Peter Bro Miltersen.)

Room: 5523.131, INCUBA SCIENCE PARK

Thursdays, 12-15


See the project webpage.


The course has no single source text, as the lectures are drawn from multiple sources. Here are some materials on the subject, more or less in the same vein as the course:

· Book drafts by Jerrum: at Jerrum’s publications page (scroll down)

· Book by Aldous: Reversible Markov Chains and Random Walks on Graphs

· Book by Levin, Peres and Wilmer: Markov Chains and Mixing Times

· A Survey on the use of Markov Chains to Randomly Sample Colorings by A. Frieze and E. Vigoda

· Geometric Random Walks: A Survey, by Santosh Vempala

· Rapidly mixing Markov chains with applications in computer science and physics. By Dana Randall (2006)

__With Lecture Notes:__

__Without lecture notes (but with some other useful material):__

__Courses on related subjects:__

· Another Dana Randall Course (with lecture notes)

· Luca Trevisan’s Course (with great lecture notes!)


__Lecture 1:__

Introduction (this lecture covers a lot but the treatment is less formal)

Ergodicity (only statements, not proofs)

Sampling a k-coloring of a graph, for k >= 2*Delta+1 (Delta is the maximum degree): the Glauber dynamics; proving rapid mixing using coupling and path coupling.
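As a concrete illustration, here is a minimal sketch of the Glauber dynamics for k-colorings; the adjacency-list representation, function names, and step count are my own illustrative choices, not taken from the lecture notes.

```python
import random

def glauber_step(graph, coloring, k, rng):
    # One Glauber step: pick a uniform vertex and a uniform color,
    # and recolor the vertex if no neighbor already has that color.
    v = rng.randrange(len(graph))
    c = rng.randrange(k)
    if all(coloring[u] != c for u in graph[v]):
        coloring[v] = c

def sample_coloring(graph, k, steps=20000, seed=None):
    # Start from a greedy proper coloring (possible whenever k >= Delta + 1)
    # and run the chain; for k >= 2*Delta + 1 the chain mixes rapidly.
    rng = random.Random(seed)
    coloring = [-1] * len(graph)
    for v in range(len(graph)):
        used = {coloring[u] for u in graph[v] if coloring[u] != -1}
        coloring[v] = next(c for c in range(k) if c not in used)
    for _ in range(steps):
        glauber_step(graph, coloring, k, rng)
    return coloring
```

Every step preserves properness, so the chain walks on the set of proper colorings and (for k large enough) converges to the uniform distribution over them.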

Connection between approximate sampling and approximate counting
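The reduction can be sketched on a tiny instance: count proper colorings by adding the edges of the graph one at a time and estimating each successive ratio of counts by sampling. Here rejection sampling stands in for the MCMC sampler (which is what makes the reduction useful on large instances); the function name and parameters are illustrative.

```python
import random

def count_colorings(n, edges, k, samples=4000, seed=0):
    # Telescoping product: |colorings(G)| = k^n * prod_i ratio_i, where
    # ratio_i = |colorings(G_i)| / |colorings(G_{i-1})| and G_i has the
    # first i edges. Each ratio is estimated by sampling from G_{i-1}.
    rng = random.Random(seed)
    estimate = float(k ** n)  # the empty graph has k^n colorings
    for i, (u, v) in enumerate(edges):
        prev = edges[:i]
        hits = 0
        for _ in range(samples):
            # Uniform proper coloring of G_{i-1} by rejection sampling
            # (fine for tiny n; an MCMC sampler replaces this in general).
            while True:
                c = [rng.randrange(k) for _ in range(n)]
                if all(c[a] != c[b] for a, b in prev):
                    break
            if c[u] != c[v]:
                hits += 1
        estimate *= hits / samples
    return estimate
```

Each ratio is bounded away from 0 (for k large enough), so polynomially many samples per edge give a good relative-error estimate of the product.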

Source: Vigoda lecture notes (lectures 4-6,8-9)

__Lecture 2:__

A More formal treatment: Proofs of Ergodicity Theorem and some of the Coupling-related results we used last time.

Sampling spanning trees
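One classical algorithm in this area is the Aldous-Broder random walk: walk on the graph until every vertex has been visited, and keep the edge by which each vertex was first entered. The resulting edges form a uniformly random spanning tree. The sketch below assumes an adjacency-list representation; names are illustrative.

```python
import random

def aldous_broder(graph, seed=None):
    # Random-walk until all vertices are covered; the first-entry
    # edges form a uniformly random spanning tree of the graph.
    rng = random.Random(seed)
    n = len(graph)
    v = rng.randrange(n)
    visited = {v}
    tree = []
    while len(visited) < n:
        u = rng.choice(graph[v])
        if u not in visited:
            visited.add(u)
            tree.append((v, u))
        v = u
    return tree
```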

Source: Vigoda lecture notes (lectures 4-6,8-9)

__Lecture 3:__

Perfect Sampling: Coupling from the past, and Huber’s Bounding Chains and their use for perfectly sampling k-colorings (for k> (Delta+1)^2)
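To make coupling from the past concrete, here is a minimal sketch of monotone CFTP for a toy chain: the lazy random walk on {0, ..., n}, whose stationary distribution is uniform. The colorings application uses the same structure with a more elaborate (bounding) chain; the chain and names below are illustrative only.

```python
import random

def update(x, u, n):
    # Monotone update rule for the lazy walk on {0, ..., n}:
    # the same randomness u moves every state in the same direction.
    if u < 1 / 3:
        return max(x - 1, 0)
    if u < 2 / 3:
        return x
    return min(x + 1, n)

def cftp(n, seed=None):
    # Monotone coupling from the past: run the top (n) and bottom (0)
    # chains from time -T with SHARED randomness, doubling T until they
    # coalesce by time 0; the common value is an exact stationary sample.
    rng = random.Random(seed)
    rand = {}  # randomness is reused across restarts, keyed by time
    T = 1
    while True:
        for t in range(-T, 0):
            if t not in rand:
                rand[t] = rng.random()
        lo, hi = 0, n
        for t in range(-T, 0):
            lo = update(lo, rand[t], n)
            hi = update(hi, rand[t], n)
        if lo == hi:
            return lo
        T *= 2
```

The key points are reusing the same randomness for each time step across restarts and always restarting further in the past, never extending forward.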

Source: Vigoda lecture notes (lecture 7) and Huber’s Bounding Chains paper (“Perfect sampling using bounding chains”, here)

__Lecture 4:__

Estimating the Volume of a Convex Body:

The Ball walk; Specialized Isoperimetric Inequalities; Proof of rapid convergence using conductance
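A single ball-walk step can be sketched as follows: propose a uniform point in the ball of radius delta around the current point, and move there only if it lies in the body (accessed through a membership oracle). The cube body below is just an illustrative choice.

```python
import math
import random

def ball_walk_step(x, delta, in_body, rng):
    # Propose a uniform point in the ball of radius delta around x:
    # Gaussian direction, radius delta * U^(1/d); stay put if the
    # proposal falls outside the body.
    d = len(x)
    g = [rng.gauss(0, 1) for _ in range(d)]
    norm = math.sqrt(sum(gi * gi for gi in g))
    r = delta * rng.random() ** (1 / d)
    y = [xi + r * gi / norm for xi, gi in zip(x, g)]
    return y if in_body(y) else x

# Illustrative body: the unit cube [0, 1]^d.
def in_cube(y):
    return all(0 <= yi <= 1 for yi in y)
```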

Relation to Volume Estimation

Source: Vempala’s Survey on geometric random walks

__Lecture 5:__

The method of canonical paths, and its application for sampling matchings in a graph
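A stripped-down version of the matchings chain can be sketched as follows. This keeps only the add/remove moves (the slide move of the full Jerrum-Sinclair chain is omitted for brevity); the resulting chain is symmetric, so its stationary distribution is uniform over all matchings of the graph.

```python
import random

def matching_step(edges, M, rng):
    # Pick a uniform edge: remove it if it is in the matching, add it
    # if both endpoints are free, otherwise do nothing. Every move and
    # its reverse have the same probability, so the chain is symmetric.
    e = rng.choice(edges)
    u, v = e
    if e in M:
        M.remove(e)
    elif all(u not in f and v not in f for f in M):
        M.add(e)
    return M
```

The canonical-paths method is then used to bound the congestion of flows routed between pairs of matchings, which bounds the mixing time.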

Source: Jerrum book chapter 5 (but there are good treatments also in Sinclair and Randall’s lecture notes)

__Lecture 6:__

The connection between expanders and MCMC: spectral expansion, edge expansion, the relation between them, and what good expansion implies (fast mixing, good derandomization).
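To illustrate the "expansion implies fast mixing" direction, here is a small pure-Python check on the lazy walk on a cycle: for a reversible chain, the spectral bound gives ||P^t(x,.) - pi||_TV <= (1/2) * sqrt((1 - pi(x)) / pi(x)) * lambda_*^t, where lambda_* is the second-largest eigenvalue in absolute value. The instance and names are illustrative.

```python
import math

def lazy_cycle(n):
    # Transition matrix of the lazy simple random walk on the n-cycle.
    P = [[0.0] * n for _ in range(n)]
    for i in range(n):
        P[i][i] = 0.5
        P[i][(i - 1) % n] += 0.25
        P[i][(i + 1) % n] += 0.25
    return P

def step(dist, P):
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def tv_to_uniform(dist):
    n = len(dist)
    return 0.5 * sum(abs(p - 1.0 / n) for p in dist)

n, t = 8, 60
P = lazy_cycle(n)
dist = [1.0] + [0.0] * (n - 1)  # start from a point mass at vertex 0
for _ in range(t):
    dist = step(dist, P)
# The eigenvalues of the lazy cycle walk are 1/2 + (1/2)cos(2*pi*j/n),
# all nonnegative, so lambda_* is the one with j = 1.
lam2 = 0.5 + 0.5 * math.cos(2 * math.pi / n)
bound = 0.5 * math.sqrt(n - 1) * lam2 ** t
```

The spectral gap 1 - lambda_2 here plays the role of expansion: the larger the gap, the faster the geometric decay of the distance to stationarity.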

Source: The derandomization part can be read in the Dwork-Harsha lecture notes on expanders (see lecture 4). The part about expansion implying fast mixing can be read here. More generally, Luca Trevisan teaches a course on the relations between expanders, MCMC, and approximation algorithms for sparsest cut; see the source list above.

__Lecture 7:__

To come.