Direct PDF link for non-subscribers
"Information theory must precede probability theory, and not be based on it. By the very essence of this discipline, the foundations of information theory have a finite combinatorial character."

- Andrey Kolmogorov
Many alignment researchers borrow intuitions from thermodynamics: entropy relates to information, which relates to learning and epistemology. These connections were first revealed by Szilárd's resolution of Maxwell's demon, the famous thought experiment. However, the classical tools of equilibrium thermodynamics are not ideally suited to studying information processing far from equilibrium.
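To recall the quantitative core of Szilárd's link (standard textbook physics, not a claim of the new paper): one bit of information about a system in contact with a heat bath at temperature $T$ lets an agent extract at most

$$W \le k_B T \ln 2$$

of work, where $k_B$ is Boltzmann's constant; dually, Landauer's principle says erasing one bit must dissipate at least that much heat.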
This new work reframes thermodynamics in terms of the algorithmic entropy. It takes an information-first approach, delaying the introduction of physical concepts such as energy and temperature until after the foundations are set. I find this approach more conceptually principled and elegant than the traditional alternatives.
It's based on a 30-year-old workshop paper, which until now had been largely forgotten. Roughly speaking, the algorithmic entropy of a physical state is its Kolmogorov complexity; that is, the length of the shortest program that outputs a description of its microscopic configuration to some bounded precision. This definition does away with probability distributions and macrovariables, and satisfies very general laws!
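In symbols (my notation here, not necessarily the paper's): fixing a universal prefix machine $U$, the Kolmogorov complexity of a finite microstate description $x$ is

$$K(x) = \min\{\,|p| : U(p) = x\,\},$$

the bit-length of the shortest program $p$ that makes $U$ print $x$. The algorithmic entropy is then $K(x)$, well-defined up to a machine-dependent additive constant, with a factor of $k_B \ln 2$ converting bits into conventional physical entropy units. Since $K$ is uncomputable, in practice one works with upper bounds, e.g. the compressed length of $x$.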
The paper is long, in part because I tried to make it self-contained. If you find yourself using entropy in a setting that is not described by a large number of identically distributed variables, then consider reframing your intuitions in terms of the algorithmic entropy!
These recordings I watched were actually from 2022 and weren't the Santa Fe ones.