In honor of !!con, I was trying to think of programming-relevant uses of double exclamation marks. Other than denoting the end of a particularly funny programming joke (for example, a mangled homage to a famous paper), it seemed the best place to look might be in the world of double factorials. As it turns out, that wasn't as fruitful as I'd hoped, but I learned something interesting on the way to finding that out, and here it is.

# TL;DR

You can learn interesting things about the order of complexity of an algorithm without knowing anything about how it works.

# Informational entropy

In statistical mechanics, the entropy $S$ of a physical system that can exist in any of multiple states $i$, each with probability $p_i$, is

$$S = - k_B \sum_i p_i \ln p_i$$

where $k_B$ is the Boltzmann constant. If you're not a physics person, don't worry about $k_B$, or about why it turns out that this is the same entropy that relates temperature $T$ to heat $Q$:

$$\delta Q = T \delta S$$

This thermodynamic relationship tells us that the amount of heat $\delta Q$ you have to transfer to move a smidgen of entropy $\delta S$ around is proportional to $\delta S$, with the temperature $T$ as the constant of proportionality.

Claude Shannon defined informational entropy for a variable with multiple discrete values $i$ that it can assume with probability $p_i$ in almost the same way,

$$S = - \sum_i p_i \log p_i$$

the difference being a trivial multiplicative constant: we're not specific about the base of the $\log$, and we've dropped the Boltzmann constant (which is just an artifact of the units we use to measure energy).
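To make the definition concrete, here's a quick sketch in Python (my own illustration, not from any reference implementation) that computes the Shannon entropy of a sequence by estimating the $p_i$ from observed frequencies:

```python
import math
from collections import Counter

def shannon_entropy(values, base=2):
    """Shannon entropy -sum(p_i * log(p_i)) of a sequence of discrete
    values, with p_i estimated as each value's observed frequency.
    Base 2 gives the answer in bits."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log(c / n, base) for c in counts.values())

# A fair coin carries one bit of entropy per flip:
print(shannon_entropy(["H", "T"]))        # 1.0
# A perfectly predictable sequence carries none:
print(shannon_entropy(["H", "H", "H"]))   # 0.0 (up to sign of floating-point zero)
```

Choosing `base=2` measures entropy in bits; switching to natural log (nats) just rescales the result by $\ln 2$, which is exactly the "trivial multiplicative constant" mentioned above.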