A 100 kT signal is only reliable over a distance of a few nanometers; the energy cost is dominated by pushing signals through wires. To cross a distance of around 1 mm, the synaptic signal needs to be roughly a million times larger than 100 kT, which works out to about 10^-13 J per synaptic event. That gives 10 watts for 10^14 synapses at an average rate of 1 Hz. For a 100 Hz rate, the average distance would need to be correspondingly smaller.
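A minimal back-of-envelope sketch of that arithmetic, assuming T = 310 K (body temperature); the 10^-13 J and 10 W figures are order-of-magnitude roundings:

```python
# Quick numeric check of the estimate above. Assumption (mine): T = 310 K.

k_B = 1.380649e-23            # Boltzmann constant, J/K
T = 310.0                     # body temperature, K

floor = 100 * k_B * T         # reliable 100 kT signal floor, ~4e-19 J
boosted = 1e6 * floor         # ~10^6x larger to cross ~1 mm, ~4e-13 J
per_event = 1e-13             # J, the rounded order-of-magnitude figure used above

synapses = 1e14               # assumed synapse count
rate_hz = 1.0                 # assumed average firing rate

print(f"100 kT floor:   {floor:.1e} J")
print(f"1e6 x floor:    {boosted:.1e} J (~1e-13 J order of magnitude)")
print(f"total power:    {per_event * synapses * rate_hz:.0f} W at {rate_hz:.0f} Hz")
```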
At some point soon, I'm going to attempt to steelman the position of those who reject the AI risk thesis, to see if it can be made solid. Here, I'm just asking if people can link to the most convincing arguments they've found against AI risk.
EDIT: Thanks for all the contributions! Keep them coming...