steven0461 comments on Modest Superintelligences - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
What do you see as the causal connection here?
That part seems less controversial than the others--no Reformation and no Enlightenment means no science, which means no GAI, which means no uFAI. I'm sure there are multiple disjunctive ways science might have come about, but it is rather strange that it took humanity so long.
As it is, LW--or at least Yudkowsky--believes uFAI is more likely than FAI, and that it is the most concerning current existential risk.
I don't really see how retaining the divine right of kings would have forestalled all the other existential risks, though.