Entropy and Counting

Entropy is a notion from information theory; it was introduced by Claude Shannon in the 1940s. Entropy quantifies the expected amount of information contained in a realisation of a discrete random variable. In particular, if X is a uniformly chosen random element of a finite set S, then the entropy of X is the logarithm of |S|. This allows one to rephrase counting problems in the language of entropy. Such a rephrasing opens up the possibility of applying tools from information theory to the study of counting problems in combinatorics; in recent years, this approach has been applied very successfully to a range of enumeration problems.
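
To spell out the identity behind this rephrasing (with the base of the logarithm left as a matter of convention, e.g. base 2 if entropy is measured in bits), the entropy of a discrete random variable X is
\[
H(X) \;=\; -\sum_{x} \Pr[X = x] \,\log \Pr[X = x],
\]
and if X is uniform on a finite set S, then every probability equals 1/|S|, so
\[
H(X) \;=\; -\sum_{x \in S} \frac{1}{|S|} \,\log \frac{1}{|S|} \;=\; \log |S|.
\]
Thus a bound on the entropy of a uniformly chosen element of S is precisely a bound on log |S|, that is, on the size of S.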

In this course, we will introduce the notion of entropy and derive several useful identities and inequalities relating entropies. We will then demonstrate the power of these inequalities by presenting several applications of the ‘entropy method’ in combinatorics.