
This is simply because AIs run on electricity.

Our brains run on sunlight, via photosynthesis and dietary energy intake. From sunlight to dietary energy, this chain is about 0.25-0.5% energy efficient; let's say it's 0.35% efficient. (This is then our complete sunlight-to-capability efficiency for a human.)

AI systems run on sunlight, via solar power and electricity consumption. From sunlight to GPU, this is about 10-18% efficient; let's say it's 13% efficient. (Data: typical solar panel efficiency is 15-20%; typical electricity distribution efficiency is about 90%.) This 13% is then AI sunlight-to-electricity efficiency, but not yet sunlight-to-capability efficiency.

Let's calculate the final part. We need to assume an amount of compute for a human-equivalent AI system. I will assume that (at inference time) we can run a human-equivalent AI system on 10 NVIDIA 4070 GPUs at full power, each consuming ~200 watts, so 2 kW in total. In contrast, the typical human consumes about 100 watts. 100 W divided by 2 kW gives us 5% as AI electricity-to-capability efficiency, which we then multiply by the figure above: 5% electricity-to-capability efficiency times 13% sunlight-to-electricity efficiency = 0.65% sunlight-to-capability efficiency.
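For concreteness, here is a minimal sketch of the arithmetic, using only the assumptions stated above (none of these are measured values):

```python
# Sunlight-to-capability efficiency under the post's assumptions.

# Human chain: sunlight -> photosynthesis -> dietary energy.
human_efficiency = 0.0035        # 0.25-0.5% sunlight-to-food, taken as 0.35%

# AI chain: sunlight -> solar panel (~15-20%) -> grid (~90%) -> GPUs.
sunlight_to_electricity = 0.13   # roughly 0.15 * 0.90, rounded to 13%

human_power_w = 100              # typical whole-body power draw, in watts
ai_power_w = 10 * 200            # 10 NVIDIA 4070s at ~200 W each = 2 kW
electricity_to_capability = human_power_w / ai_power_w   # 0.05, i.e. 5%

ai_efficiency = sunlight_to_electricity * electricity_to_capability
print(f"human: {human_efficiency:.2%}  AI: {ai_efficiency:.2%}")
# human: 0.35%  AI: 0.65%
```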

So humans are 0.35% efficient, and AIs are 0.65% efficient.

By these assumptions, AI is somewhat more efficient in terms of real energy input - here, energy from the sun. This is the number that the economy is going to care about - why use land for crops, when you can use it for solar panels?

Please pick apart my numbers and assumptions in the comments. Thanks.

7 comments

The conclusion seems rather to be "human metabolism is less efficient than solar panels," which, while perhaps true, has limited bearing on the question of whether or not the brain is thermodynamically efficient as a computer when compared to current or future AI. The latter is the question that recent discussion has been focused on, and to which the "No - " in the title makes it seem like you're responding.

Moreover, while a quick Google search turns up 100 W as the average resting power output of a person, another search suggests the brain is responsible for only about 20% of that energy consumption. Adding this to your analysis gives 0.13% "efficiency" in the sense that you're using it, so the brain still outperforms AI even on this admittedly rather odd sunlight-to-capability metric.
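Spelling that out as a minimal sketch, substituting the brain's ~20 W for the whole-body 100 W in the electricity-to-capability ratio:

```python
brain_power_w = 0.20 * 100    # brain is ~20% of the body's ~100 W
ai_power_w = 10 * 200         # the post's 10 GPUs at ~200 W each
electricity_to_capability = brain_power_w / ai_power_w   # 0.01, i.e. 1%
ai_efficiency = 0.13 * electricity_to_capability         # 0.0013, i.e. 0.13%
# versus the 0.35% sunlight-to-capability figure for the human
```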

Well, yes, the point of my post is just that the number that actually matters is the end-to-end energy efficiency — and on that measure AI is completely comparable to humans.

The per-flop efficiency is obviously worse. But that's irrelevant if AI is already cheaper for a given task in real terms.

I admit the title is a little clickbaity, but I am responding to a real argument (that humans are still "superior" to AI because the brain is more thermodynamically efficient per-flop).

I made a similar point (but without specific numbers - great to have them!) in a comment https://www.lesswrong.com/posts/Lwy7XKsDEEkjskZ77/?commentId=nQYirfRzhpgdfF775 on a post that posited human brain energy efficiency over AIs as a core anti-doom argument, and I also think that the energy efficiency comparisons are not particularly relevant either way:

Humanity is generating and consuming enormous amounts of power - why is the power budget even relevant? And even if it were, the energy for running brains ultimately comes from the Sun - if you include the agricultural energy chain and "grade" the energy efficiency of brains by the amount of solar energy it ultimately takes to power a brain, AI definitely has the potential to be more efficient. And even if a single human brain is fairly efficient, human civilization as a whole is clearly not. With AI, you can quickly scale up the amount of compute you use, whereas for humans, scaling beyond a single brain is very inefficient.

If you go this way, you have to include the energy cost of growing a human on the one hand and building and deploying the solar cells and the chip factories on the other.

Yes, that's fair. I was ignoring scale, but you're right that the better comparison is between a marginal new human and a marginal new AI.

Can't you make human food production a lot more efficient with biotech? Algae, for instance? Spirulina maybe? Tastes bitter, grows fast, highly nutritious. (Are plants or algae as efficient at generating sugars from sunlight as new forms of life evolved to directly use electricity from a solar panel would be?)

Even if that weren't practical for humans, such an organism seems very easily imaginable, and I think that still gives us some weird biopunk menu options for the medium-term future of intelligence?

I saw some numbers for algae being 1-2% efficient, but that was for biomass rather than dietary energy. Even if you put the brain in the same organism, you wouldn't expect efficiency that good. The difference is that creating biomass (which is mostly long chains of glucose) is only the first step; the brain must then use that glucose, which is a second, lossy step.
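A rough sketch of how the two steps compound (the glucose-to-useful-work fraction here is only an illustrative assumption, not a measured number):

```python
sunlight_to_biomass = 0.015   # the ~1-2% figure reported for algae
glucose_to_work = 0.4         # illustrative assumption for the second, lossy step
overall = sunlight_to_biomass * glucose_to_work   # 0.006, i.e. ~0.6%
# the 1-2% headline shrinks once the second step is included
```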

But I mean, there are definitely far-future biopunk options, e.g. I'd guess it's easy to create some kind of solar-panel organism which grows silicon crystals instead of using chlorophyll.