Probability / Information Theory
One of the most natural ways to measure the rate of convergence to equilibrium in random systems is the entropy (or, more precisely, the entropy gap between the current distribution and equilibrium). Some years ago, Barthe, Naor and I found a new variational characterisation of information and entropy which allows us to use spectral theory to estimate the entropy of marginal distributions and thereby of sums of independent random variables. We obtained the first quantitative measures of entropy growth for sums of random variables belonging to a reasonably broad class. Among other things, the method leads to an analogue of the second law of thermodynamics for the central limit process.
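The entropy growth along the central limit process can be seen numerically. The sketch below (my own illustration, not the spectral method described above) estimates the differential entropy of standardized sums of independent uniform random variables with a simple histogram estimator; the estimates increase with the number of summands toward the Gaussian maximum 0.5·ln(2πe) ≈ 1.4189. The function names and sample sizes are choices made for the demo.

```python
import math
import random

def entropy_estimate(samples, bins=80):
    """Histogram (plug-in) estimate of differential entropy, in nats."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins
    counts = [0] * bins
    for s in samples:
        i = min(int((s - lo) / width), bins - 1)
        counts[i] += 1
    n = len(samples)
    h = 0.0
    for c in counts:
        if c:
            p = c / n
            h -= p * math.log(p)
    # Correct the discrete entropy by the bin width to approximate
    # the differential entropy of the underlying density.
    return h + math.log(width)

def standardized_sum(n, trials, rng):
    """Samples of (X1 + ... + Xn - mean) / sd for iid Uniform(0,1) Xi."""
    mean, sd = n / 2, math.sqrt(n / 12)
    return [(sum(rng.random() for _ in range(n)) - mean) / sd
            for _ in range(trials)]

rng = random.Random(0)
h = [entropy_estimate(standardized_sum(n, 200_000, rng)) for n in (1, 2, 4)]
print(h)  # increasing, approaching 0.5 * ln(2*pi*e) ~ 1.4189
```

The monotone increase of these entropies (not just their convergence) is the content of the second-law analogue mentioned above; the histogram estimator here is only a crude illustration of that phenomenon.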
The real plank theorem states that if f1, f2, … are unit functionals on a normed space X, and w1, w2, … are positive numbers with sum at most 1, then there is a point x in the unit ball of X for which |fi(x)| is at least wi for every i.
The result generalises the separation theorem and provides a sharp version of the uniform boundedness principle. It can be rephrased as a geometric pigeon-hole principle: if a symmetric convex set is cut by n hyperplanes, there remains an uncut copy of the set, 1/(n+1) times as large, inside the original set.
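A toy numerical check of the plank theorem in the Euclidean plane (where the unit ball is the closed disk): given unit functionals fi and weights wi with sum at most 1, a brute-force grid search always turns up a point x in the disk with |fi(x)| ≥ wi for every i. This is only an illustration in one concrete space, not a proof; the function `plank_point`, the grid resolution, and the example functionals are my own choices.

```python
import math
import itertools

def plank_point(functionals, weights, grid=400):
    """Grid-search the closed unit disk for a point x with
    |f_i(x)| >= w_i for every i.  Each functional is a unit vector
    (a, b) acting on x = (x1, x2) by f(x) = a*x1 + b*x2."""
    for i, j in itertools.product(range(grid + 1), repeat=2):
        x = (-1 + 2 * i / grid, -1 + 2 * j / grid)
        if x[0] ** 2 + x[1] ** 2 > 1:
            continue  # outside the unit ball
        if all(abs(a * x[0] + b * x[1]) >= w
               for (a, b), w in zip(functionals, weights)):
            return x
    return None

# Two orthogonal unit functionals with equal weights summing to 1:
fs = [(1.0, 0.0), (0.0, 1.0)]
ws = [0.5, 0.5]
print(plank_point(fs, ws))  # a point such as (1/sqrt(2), 1/sqrt(2)) qualifies
```

In the language of planks: the two conditions |x1| ≥ 1/2 and |x2| ≥ 1/2 say that x avoids the two planks (slabs) of half-width 1/2 around the coordinate axes, and the theorem guarantees such a point survives in the ball whenever the half-widths sum to at most 1.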
Plank methods also yield packings of spheres in high-dimensional space. The plank lemma of T. Bang can be used to weave an ellipsoidal “pancake” of large volume between the lattice points that lie in a large ball. Copies of this pancake can be packed efficiently.