
Presentation

Shannon entropy, defined for a discrete probability distribution, is a fundamental concept in information theory. It is also a surprisingly useful tool for proving theorems in combinatorics. This lecture will present several uses of this technique: the first half will be devoted to extremal combinatorics, the second to additive combinatorics. Two recent breakthroughs will be highlighted. The first is a major advance towards the union-closed sets conjecture, which asserts that any finite union-closed family of sets contains an element belonging to at least half of the sets. The second is a proof of Marton's conjecture, which provides an optimal quantitative version of Freiman's theorem on the structure of subsets of F_2^n with small sumset.
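
For reference, here is the quantity alluded to above, written out in standard notation (the symbols p_1, ..., p_n are not from the original announcement): the Shannon entropy of a discrete probability distribution p = (p_1, ..., p_n) is

\[
  H(p) \;=\; -\sum_{i=1}^{n} p_i \log_2 p_i ,
\]

with the convention 0 \log 0 = 0. It is maximised by the uniform distribution, where it equals \log_2 n; this maximality is the basic fact behind many of the combinatorial counting arguments mentioned in the lecture.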