wedrifid comments on Costs to (potentially) eternal life - Less Wrong

8 Post author: bgrah449 21 January 2010 09:46PM


Comment author: wedrifid 21 January 2010 11:41:13PM *  7 points [-]

What does she say that convinces you?

  • I am wired with explosives triggered by an internal heart rate monitor.
  • My husband, right next to me, is 100 kg of raw muscle and armed.
  • I was the lead developer of an AGI that is scheduled to hit start in three weeks. I quit when I saw that the 'Friendliness' intended is actually a dystopia and my protests were suppressed. I have just cancelled my cryonics membership, and the reason your cryonic revival is dependent on killing me is that I am planning to sabotage the AI.

  • A catch-all: Humans can always say with sincerity that they would never do something so immoral under any circumstances, without that necessarily changing their behaviour in the moment.

  • Awareness of the above tendency in oneself often comes with the (necessary) willingness to explicitly lie about one's values, for the same reasons one would otherwise have lied to oneself.
  • Related to the above, there is a natural instinct to speak out in outrage against anyone who doesn't condemn such immoral actions, or even against those who don't imply that the answer should be known a priori.
  • This plays a part in the votes your post has received, which is unfortunate. I thank you for making it and hope the magnified downvotes do not put you under the threshold for posting.
Comment author: Technologos 22 January 2010 02:58:34AM 2 points [-]

I was the lead developer of an AGI that is scheduled to hit start in three weeks. I quit when I saw that the 'Friendliness' intended is actually a dystopia and my protests were suppressed. I have just cancelled my cryonics membership and the reason your cryonic revival is dependent on killing me is that I am planning to sabotage the AI.

Is it weird that my first reaction is to ask her specific questions about the Sequences to test the likelihood of that statement's veracity?