James_Miller comments on AALWA: Ask any LessWronger anything - Less Wrong

28 Post author: Will_Newsome 12 January 2014 02:18AM


Comment author: James_Miller 12 January 2014 07:32:03PM 9 points

I have thought a lot about this. Possible reasons: most humans don't care about the far future or about people who are not yet born; most things that seem absurd really are absurd and not worth investigating, and the singularity certainly seems absurd on the surface; the vast majority is right, and you and I are wrong to worry about a singularity; people find it impossible to imagine an intelligent AI that doesn't have human-like emotions; the Fermi paradox implies that civilizations such as ours will not be able to think rationally about the far future; and an ultra-AI would be a god, which most people's religious beliefs disallow.

Your question is related to why so few people sign up for cryonics.

Comment author: NancyLebovitz 12 January 2014 08:35:27PM 5 points

I don't know about anyone else, but I find it hard to believe that provable Friendliness is possible.

On the other hand, I think high-probability Friendliness might be possible.

Comment author: JoshuaFox 12 January 2014 07:43:18PM 2 points

I agree that a lot of people think that way, but I have spoken to quite a few smart people who understand all the points (I probe to figure out whether there are any major inferential gaps), and they still don't get on the bandwagon.

Another point is simply that we cannot all devote time to all important things; they choose not to prioritize this one.