Even if a computation runs forever, it doesn't necessarily perform more than a finite amount of computation. And once we are below the Planck temperature scale, further cooling is useless for heat-driven machines. Life stops.

I believe that there is a lot of computing down there in the coldness around the absolute zero. But not an infinite amount.

Thomas's comment seems quite sensible to me.

It seems to me that Dyson's argument was that as temperature falls, so does the energy required for computing, so the point in time when we run out of available energy to compute diverges. But Thomas reasonably points out (I think; correct me if I am misrepresenting you, Thomas) that as temperature falls and the energy used for computing falls, so does the speed of computation, and so the amount of computation that can be performed converges, even if we were to compute forever. Also, isn't Thomas correct that Planck's constant puts an absolute minimum on the amount of energy required for computation?
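Thomas's convergence point can be sketched numerically. This is a toy model, not Dyson's actual analysis: I assume the temperature halves each epoch, the achievable computation rate scales linearly with temperature, and each operation costs the Landauer minimum kT ln 2. All the specific numbers (starting temperature, starting rate, epoch length) are made up purely for illustration.

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

# Toy assumptions (illustrative only):
T0 = 1.0     # starting temperature, K (arbitrary)
rate0 = 1e6  # operations per second at T0 (arbitrary)
epoch = 1.0  # seconds per epoch (arbitrary)

total_ops = 0.0
total_energy = 0.0
for n in range(200):
    T = T0 / 2**n                 # temperature halves each epoch
    rate = rate0 * (T / T0)       # computation rate falls with T
    total_ops += rate * epoch     # operations done this epoch
    # Landauer cost per operation ~ kT ln 2, also falling with T
    total_energy += rate * epoch * k * T * math.log(2)

# The per-epoch operation counts form a geometric series,
# so the total converges (here to rate0 * epoch * 2 = 2e6)
# even over unboundedly many epochs; the energy spent
# converges even faster, since both factors shrink.
print(total_ops, total_energy)
```

The point of the sketch is just the shape of the series: if the rate of computing falls geometrically along with the temperature, then computing forever still yields only a finite total, which is the convergence Thomas is describing.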

These seem like perfectly reasonable responses to Dyson's comments. What am I missing?

You are missing the concept of blather.