J_Taylor comments on Holden Karnofsky's Singularity Institute Objection 1 - Less Wrong

Post author: ciphergoth 11 May 2012 07:16AM

Comment author: J_Taylor 13 May 2012 10:58:43PM 1 point

Although I am extremely interested in your theories, it would take significant time and energy for me to reformulate my ideas so as to satisfactorily incorporate the points you are making. As such, for purposes of this discussion, I shall essentially be speaking as if I had not been made aware of the post you just made.

However, if you could clarify a minor point: am I mistaken in my belief that it is the SI's position that uFAI will probably result in human extinction? Or, have they incorporated the points you are making into their theories?

Comment author: Will_Newsome 14 May 2012 01:18:19AM 2 points

I know that Anna at least has explicitly taken such considerations into account and agrees with them to some extent. Carl likely has as well. I don't know about Eliezer or Luke; I'll ask Luke next time I see him. ETA: That is, I know Carl and Anna have considered the points in my first paragraph, but I don't know how thoroughly they've explored the classes of scenarios like those in my second paragraph, which are a lot more speculative.

Comment author: Will_Newsome 15 May 2012 10:00:39PM 1 point

Eliezer replied here, but it seems he's only addressed one part of my argument thus far. I personally think the alien superintelligence variation of my argument, which Eliezer didn't address, is the strongest, because it's well-grounded in known physical facts, unlike simulation-based speculation.