HungryTurtle comments on Avoiding Your Belief's Real Weak Points - Less Wrong

49 Post author: Eliezer_Yudkowsky 05 October 2007 01:59AM


Comment author: Desrtopa 15 April 2012 03:42:14AM *  7 points [-]

Eliezer hasn't argued for the unquestioned rightness of rapid, continual technological innovation. On the contrary, he's argued that scientists should bear some responsibility for the potentially dangerous fruits of their work, rather than handwaving it away with the presumption that the developments can't do any harm, or if they can, it's not their responsibility.

In fact, the primary purpose of the SIAI is to try and get a particular technological development right, because they are convinced that getting it wrong could fuck up everything worse than anything has ever been fucked up.

Comment author: HungryTurtle 18 April 2012 12:20:30PM 0 points [-]

Could you show me where he argues this?

Comment author: Desrtopa 18 April 2012 01:57:18PM 2 points [-]

I'm afraid I don't remember in which post he discusses the idea that scientists should worry about the ethics of their work, and I'm having a difficult time finding it. If you want to find that specific post, it might be better to create an open request in a more prominent place and see if anyone else remembers which one it was.

Although it would take a much longer time, I think it might be a good idea for you to read all the sequences. Eliezer wrote them to bring people up to speed with his position on the development of AI and rationality, after all, so that if we are going to continue to have disagreements, at least they can be more meaningful and substantive, with all of us on the same page. It sounds very much to me like you're pattern-matching Eliezer's writing and responding to what you expect him to think, but if his position were such a short hop of inferential distance for most readers, he wouldn't have needed to go to all the work of creating the sequences in the first place.