khafra comments on Modest Superintelligences - Less Wrong

20 Post author: Wei_Dai 22 March 2012 12:29AM




Comment author: khafra 27 March 2012 03:49:57PM 1 point

That part seems less controversial than the others--no Reformation and no Enlightenment means no science, which means no GAI, which means no uFAI. Although I'm sure there are multiple disjunctive ways science might have come about, it is rather strange that it took humanity so long.

As it is, LW--or at least Yudkowsky--believes uFAI is more likely than FAI, and that it is the most concerning current existential risk.

I don't really see how retaining the divine right of kings would have forestalled all the other existential risks, though.