MADALGO theory seminar, Suresh Venkatasubramanian, University of Utah

2014.05.14

Date | Wed 21 May

Time | 14:15 – 15:00

Location | Nygaard 395

Speaker | Suresh Venkatasubramanian

Title: A directed isoperimetric inequality with application to Bregman near neighbor lower bounds

Abstract:

The \emph{Bregman divergences} $D_\phi$ are a class of distance measures parametrized by a convex function $\phi$. They include well-known distance functions like $\ell_2^2$ and the Kullback-Leibler divergence from information theory, and are used extensively in data-driven applications such as machine learning, computer vision, text mining, and speech processing. There has been extensive research on algorithms for problems like clustering and near-neighbor search with respect to Bregman divergences; in all cases, the algorithms depend not just on standard parameters like the data size $n$, dimensionality $d$, number of probes allowed $r$, and error $\varepsilon$, but also on a \emph{structure constant} $\mu \ge 1$ that depends solely on $\phi$ and can grow without bound independent of the other parameters. This dependence has withstood attempts to remove it, and appears to be intrinsic to the study of these measures.
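For reference (this definition is standard but not stated in the abstract), the Bregman divergence generated by a differentiable convex function $\phi$ is
\[
D_\phi(x, y) = \phi(x) - \phi(y) - \langle \nabla \phi(y),\, x - y \rangle,
\]
i.e., the gap at $x$ between $\phi$ and its linear approximation around $y$. Taking $\phi(x) = \|x\|_2^2$ recovers $D_\phi(x, y) = \|x - y\|_2^2$, and $\phi(x) = \sum_i x_i \log x_i$ on the probability simplex yields the Kullback-Leibler divergence. Note that $D_\phi$ is in general asymmetric, which is the source of the ``directed'' structure in the results below.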

In this paper, we provide the first evidence that this dependence might be intrinsic. In particular, we focus on the problem of approximate near-neighbor search for Bregman divergences, a problem of interest in its own right. We show that in the cell probe model, any non-adaptive data structure (like locality-sensitive hashing) for $c$-approximate near-neighbor search that admits $r$ probes must use space $\Omega(n^{1 + \frac{\mu}{c r}})$. In contrast, for LSH under $\ell_1$ the best bound is $\Omega(n^{1+\frac{1}{cr}})$.

In proving this lower bound, we follow and extend the Fourier-analytic approach that has yielded many lower bounds for such problems. Our new tool is a directed variant of the standard boolean noise operator, for which we prove a generalization of the Bonami-Beckner hypercontractivity inequality. While such an inequality cannot exist in general, we show that it exists ``in expectation'' or upon restriction to certain subsets of the Hamming cube, and that this is sufficient to prove the desired isoperimetric inequality that we use in our data structure lower bound. This operator might be of independent interest in the analysis of boolean functions.
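For context (the directed variant is the paper's contribution; only the standard operator is recalled here), the undirected noise operator on functions $f : \{-1,1\}^n \to \mathbb{R}$ is
\[
T_\rho f(x) = \mathop{\mathbb{E}}_{y \sim N_\rho(x)} \left[ f(y) \right],
\]
where $y \sim N_\rho(x)$ is obtained by flipping each coordinate of $x$ independently with probability $(1-\rho)/2$. The Bonami-Beckner hypercontractivity inequality states that $\|T_\rho f\|_q \le \|f\|_p$ whenever $1 \le p \le q$ and $\rho \le \sqrt{(p-1)/(q-1)}$. The operator introduced in the talk is a one-sided (directed) analogue of $T_\rho$, for which this inequality holds in expectation or on restricted subsets of the cube rather than pointwise.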

We also present a structural result reducing the Hamming cube to a \emph{Bregman cube}. This structure allows us to obtain lower bounds for problems under Bregman divergences from their $\ell_1$ analog. In particular, we get a (weaker) lower bound for approximate near neighbor search of the form $\Omega(n^{1 + \frac{1}{cr}})$ for an $r$-query non-adaptive data structure, and new cell probe lower bounds for a number of other related near neighbor questions in Bregman space.

Joint work with Amirali Abdullah.