Introduction
This post is written as a response to jacob_cannell's recent post Contra Yudkowsky on AI Doom. He writes:
EY correctly recognizes that thermodynamic efficiency is a key metric for computation/intelligence, and he confidently, brazenly claims (as of late 2021), that the brain is about 6 OOM from thermodynamic efficiency limits
[...]
EY is just completely out of his depth here: he doesn't seem to understand how the Landauer limit actually works, doesn't seem to understand that synapses are analog MACs which minimally require OOMs more energy than simple binary switches, doesn't seem to understand that interconnect dominates energy usage regardless, etc.
Most of Jacob's analysis for brain efficiency is contained in this post: Brain Efficiency: Much More than You Wanted to Know. I believe this analysis is flawed with respect to the thermodynamic energy efficiency of the brain. That's the scope of this post: I will respond to Jacob's claims about thermodynamic limits on brain energy efficiency. Other constraints are out of scope, as is a discussion of the rest of the analysis in Brain Efficiency.
The Landauer limit
Just to review quickly, the Landauer limit says that erasing 1 bit of information has an energy cost of $k_B T \ln 2$. This energy must be dissipated as heat into the environment. Here $k_B$ is Boltzmann's constant, while $T$ is the temperature of the environment. At room temperature, this is about 0.02 eV.
Erasing a bit is something that you have to do quite often in many types of computations, and the more bit erasures your computation needs, the more energy it costs to do that computation. (To give a general sense of how many erasures are needed to do a given amount of computation: If we add two $n$-bit numbers $x$ and $y$ to get $x+y$, and then throw away the original values of $x$ and $y$, that costs roughly $n$ bit erasures. I.e. the energy cost is roughly $n \cdot k_B T \ln 2$.)
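As a quick sanity check on these numbers, here's a short back-of-the-envelope calculation in Python (mine, not from either post; the 32-bit addition is just an arbitrary example):

```python
# Back-of-the-envelope check of the Landauer numbers above.
# Room temperature T = 300 K is assumed; the 32-bit addition is just an example.
import math

k_B = 8.617e-5                        # Boltzmann's constant in eV/K
T = 300                               # temperature in kelvin

landauer_eV = k_B * T * math.log(2)   # minimum energy to erase one bit
print(f"k_B * T * ln(2) at {T} K = {landauer_eV:.4f} eV")   # ~0.018 eV

# Adding two n-bit numbers and discarding the originals erases roughly n bits.
n = 32
print(f"Erasure cost of a {n}-bit add: ~{n * landauer_eV:.2f} eV")   # ~0.57 eV
```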
Extra reliability costs?
Brain Efficiency claims that the energy dissipation required to erase a bit becomes many times larger when we try to erase the bit reliably.
The key transition error probability $\alpha$ is constrained by the bit energy: $\alpha \approx e^{-E_b / k_B T}$. Here's a range of bit energies and corresponding minimal room temp switch error rates (in electronvolts):
- α=0.49, Eb=0.02eV
- α=0.01, Eb=0.1eV
- α≈10⁻¹⁷, Eb=1eV
This adds a factor of about 50 to the energy cost of erasing a bit, so this would be quite significant if true. To back up this claim, Jacob cites this paper by Michael P. Frank. The relevant equation is pulled from section 2. However, in that entire section, Frank is temporarily assuming that the energy used to represent the bit internally is entirely dissipated when it comes time for the bit to be erased. Dissipating that entire energy is not required by the laws of physics, however. Frank himself explicitly mentions this in the paper (see section 3): The energy used to represent the bit can be partially recovered when erasing it. Only $k_B T \ln 2$ must actually be dissipated when erasing a bit, even if we ask for very high reliability.
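As a sanity check, here's a short Python sketch (my own, not from either post): it reproduces the rough shape of the quoted table under the Boltzmann relation above, and shows where the factor of ~50 comes from if the full bit energy were dissipated on every erasure. The exact quoted values may differ slightly due to rounding or a different assumed temperature.

```python
# Sanity check on the quoted table and the "factor of ~50" claim, assuming
# the simple Boltzmann relation alpha = exp(-E_b / (k_B * T)) at T = 300 K.
import math

k_B = 8.617e-5                    # eV/K
T = 300                           # K
kT = k_B * T                      # ~0.026 eV
landauer_eV = kT * math.log(2)    # ~0.018 eV

for E_b in [0.02, 0.1, 1.0]:      # bit energies from the quoted list
    alpha = math.exp(-E_b / kT)
    print(f"E_b = {E_b:4} eV -> minimal error rate alpha ~ {alpha:.2g}")

# If the full 1 eV bit energy had to be dissipated on every erasure,
# that would be roughly 50x the Landauer limit:
print(f"1 eV / (k_B T ln 2) = {1.0 / landauer_eV:.0f}")   # ~56
```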
(I originally became suspicious of Jacob's numbers here based on a direct calculation. Details in this comment for those interested.)
Analog signals?
Quoting Brain Efficiency:
Analog operations are implemented by a large number of quantal/binary carrier units; with the binary precision equivalent to the signal to noise ratio where the noise follows a binomial distribution.
Because of this analog representation, Jacob estimates about 6000 eV is required to do the equivalent of an 8-bit multiplication. However, the laws of physics don't require us to do these operations in analog: "are implemented" does not imply "have to be implemented". Digital multiplication of two 8-bit values has a minimum cost of less than 2*8*(0.02 eV) = 0.32 eV, since at worst the 16 input bits are erased after the product is computed.
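For concreteness, here's a rough comparison in Python (my own check, assuming 300 K; the 6000 eV figure is the analog estimate discussed above):

```python
# Compare the ~6000 eV analog MAC estimate with the Landauer bound for a
# digital 8-bit multiply (at worst, the 16 input bits are erased). T = 300 K.
import math

k_B = 8.617e-5                        # eV/K
kT_ln2 = k_B * 300 * math.log(2)      # ~0.018 eV per erased bit

bits_erased = 2 * 8                   # the two 8-bit inputs, worst case
digital_bound = bits_erased * kT_ln2
analog_estimate = 6000                # eV, the analog estimate discussed above

print(f"Landauer bound for a digital 8-bit multiply: ~{digital_bound:.2f} eV")
print(f"Ratio vs the 6000 eV analog estimate: ~{analog_estimate / digital_bound:,.0f}x")
```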
What if there's something special about the inherently noisy nature of analog? Maybe the random fluctuations in signal strength are actually important to the brain's proper functioning? That's fine too: Just as the Landauer limit imposes an energy cost on erasing a bit, absorbing a random bit from the environment's thermal randomness comes with an energy bonus of up to $k_B T \ln 2$. If we have to add randomness onto our signals, that actually improves the overall energy efficiency.
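To make that bookkeeping concrete, here's a toy sketch (the bit counts are invented purely for illustration): under Landauer-style accounting, each random bit absorbed from the environment offsets one erased bit.

```python
# Toy Landauer bookkeeping: erased bits cost k_B*T*ln(2) each, while bits of
# thermal randomness absorbed from the environment count as a credit.
# The bit counts below are hypothetical, chosen only for illustration.
import math

kT_ln2 = 8.617e-5 * 300 * math.log(2)   # ~0.018 eV at 300 K

bits_erased = 100       # bits the computation must erase (hypothetical)
bits_absorbed = 30      # random bits deliberately absorbed as signal noise (hypothetical)

net_minimum = (bits_erased - bits_absorbed) * kT_ln2
print(f"Minimum dissipation: ~{net_minimum:.2f} eV")   # ~1.25 eV
```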
Dissipative interconnect?
Jacob claims that there are unavoidable energy costs to "interconnect", all the wiring that allows information to travel between different parts of the brain.
Moving a bit from one place to another does not have a minimum cost according to thermodynamics. Erasing a bit costs energy, but merely moving a bit from one place to another doesn't count as erasing it. This should be intuitively obvious if you think about it for a little while. Imagine a hard drive full of data drifting through space at a constant velocity. No matter how far it carries that data, there's no energy cost. Even the energy used to accelerate the drive can be recovered at the end when slowing it down.
To his credit, Jacob does acknowledge that non-dissipative interconnect is conceivable:
For long distance interconnect or communication reversible (ie optical) signaling is obviously vastly superior in asymptotic energy efficiency, but photons and photonics are simply fundamentally too big/bulky/costly due to their ~1000x greater wavelength and thus largely impractical for the dominate on-chip short range interconnects[12]. Reversible signaling for electronic wires requires superconductance, which is even more impractical for the foreseeable future.
Fair enough for optical, but I have no idea why he's dismissing superconductive interconnect as impractical. We already have superconductors that work at liquid nitrogen temperatures, so if you're willing to make a computer design that requires cooling, you don't even need to discover room temperature superconductors.
More generally, the issue here is that we've moved away from thermodynamic limits and into practical engineering constraints. If we want to claim that future inventors (or AIs) could never build a computing device more efficient than the human brain, then an impossibility proof based on thermodynamic limits is a very powerful way to do that, because such limits rest on fundamental physical principles that probably won't be overturned. If we instead claim that it's impossible because non-dissipative interconnect would require optics or superconductors and both seem impractical, then we're relying on future inventors not being able to invent a third kind of non-dissipative interconnect, and also not being able to make either of the known ones practical.
Invoking Thermodynamics
Thermodynamics has a funny way of surviving scientific revolutions that kill other theories. It's applicable to classical mechanics, special relativity, quantum mechanics, and quantum field theory.
If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations - then so much the worse for Maxwell's equations. If it is found to be contradicted by observation - well, these experimentalists do bungle things sometimes. But if your theory is found to be against the Second Law of Thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation. ― Arthur Eddington
Because of this great reputation thermodynamics has amongst people who know some physics, saying that your ideas are backed up by thermodynamics is almost like saying they've been mathematically proved to be true. People won't literally think you're correct with 100% probability, but pretty darn close.
So consider the following Motte and Bailey: The Motte is "the brain is close to the thermodynamic limits on energy efficiency". The Bailey is a bunch of arguments about engineering difficulty and how the only practical thing might be to use analog signals and dissipative interconnect. Now, even granting that those arguments are correct and the brain is the most efficient computing device that we could build: It will not be because the brain was already close to thermodynamic limits. Rather, it will be because the practical engineering limits turned out to be much tighter than the fundamental thermodynamic limits.
If you're invoking "the laws of thermodynamics", your arguments should generally look like a bunch of claims about energy and entropy and the reversibility of physical laws. They should not depend on things like how soon we're going to discover room temperature superconductors. Because of this, headlining with "the brain is close to the thermodynamic limits on energy efficiency" seems misleading at best.