# Mokshay Madiman: A Shannon-McMillan-Breiman theorem for log-concave measures and applications in convex geometry

Our primary goal is to describe a strong quantitative form of the Shannon-McMillan-Breiman theorem for log-concave probability measures on linear spaces, even in the absence of stationarity. The main technical result is a concentration-of-measure inequality for the "information content" of certain random vectors. We will also briefly discuss implications. In particular, by combining this concentration result with ideas from information theory and convex geometry, we obtain a reverse entropy power inequality for convex measures that generalizes the reverse Brunn-Minkowski inequality of V. Milman. Along the way, we also develop a new information-theoretic formulation of Bourgain's hyperplane conjecture, as well as some Gaussian comparison inequalities for the entropy of log-concave probability measures. This is joint work with Sergey Bobkov (Minnesota).
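For context (these standard definitions are not part of the abstract itself, but fix the objects it refers to): if $X$ has density $f$ on $\mathbb{R}^n$, its information content is the random variable $-\log f(X)$, whose mean is the differential entropy; Shannon-McMillan-Breiman-type results concern the concentration of this quantity around its mean. A sketch of the relevant formulas:

```latex
% Information content of X with density f on R^n:
%   its expectation is the differential entropy h(X).
h(X) = \mathbb{E}\left[-\log f(X)\right]
     = -\int_{\mathbb{R}^n} f(x)\log f(x)\,dx

% Entropy power of X:
N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}

% Classical entropy power inequality (Shannon--Stam),
% for independent X, Y in R^n:
N(X+Y) \;\ge\; N(X) + N(Y)
```

A "reverse" entropy power inequality, as discussed in the talk, runs in the opposite direction: roughly, after a suitable volume-preserving linear change of coordinates, $N(X+Y)$ is bounded above by a universal constant times $N(X)+N(Y)$ for log-concave (more generally, convex) measures, paralleling Milman's reverse Brunn-Minkowski inequality for volumes.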

**Category:** Probability
**Duration:** 01:14:52
**Date:** April 14, 2011 at 4:10 PM
**Tags:** seminar, Probability Seminar
