gwern comments on Greg Egan disses stand-ins for Overcoming Bias, SIAI in new book - Less Wrong Discussion
> Can you come up with problem scenarios that don't involve interactions with other intelligent agents that have a significant speed advantage or disadvantage?
>
> Sure, you can eat someone's lunch if you're faster than them, but I'm not sure what this is supposed to tell me about the nature of intelligence.
Existential risks come to mind - even setting aside the issue of astronomical waste - as setting a lower bound on how stupid lifeforms like us can afford to be.
(If we were some sort of interstellar gas cloud which could be killed only by a nearby supernova, a collapse of the vacuum, or some other really rare phenomenon, then maybe it wouldn't be so bad to take billions of years to develop in the absence of other optimizers.)