EuanMcLean

comms specialist at FAR AI

Sequences

Big Picture AI Safety

Comments

Interesting question! I'm afraid I didn't probe the cruxes of those who don't expect hard takeoff. But my guess is that you're right: no hard takeoff ~= the most transformative effects happen before recursive self-improvement.

Yeah, I think you're hitting on a weird duality between setting and erasing here, and I think I agree that setting is more fundamental than erasing. That said, when talking about the energy expenditure of computation, each set bit must be erased in the long run, so the two operations are interchangeable in that sense.
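For reference, and assuming the erasure cost at issue in this thread is the usual Landauer bound (the thread context isn't quoted here, but the mention of energy expenditure suggests it), the relevant inequality is

$$E_{\text{erase}} \;\ge\; k_B T \ln 2,$$

where $T$ is the temperature of the environment and $k_B$ is Boltzmann's constant: erasing one bit dissipates at least this much energy. Resetting a bit from an unknown state to a known one is thermodynamically the same operation, which is one way to see why the two costs are interchangeable in the long run.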

Sorry for the delay. As both you and TheMcDouglas have mentioned, yes, this relies on $H(C|X) = 0$, i.e. that $C$ is fully determined by $X$. The way I've worded it above is somewhere between misleading and wrong, so I've modified it. Thanks for pointing this out!
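For concreteness (this is just the standard information-theoretic fact, not anything specific to the post): conditional entropy vanishes exactly when the conditioned variable is a deterministic function of the conditioning one,

$$H(C \mid X) = -\sum_{x,c} p(x,c) \log p(c \mid x) = 0 \iff p(c \mid x) \in \{0, 1\} \text{ whenever } p(x) > 0,$$

i.e. $C = f(X)$ for some function $f$.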

Thanks for the comment; this is indeed an important component! I've added a couple of sentences pointing in this direction.