wedrifid comments on Costs to (potentially) eternal life - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (107)
I was the lead developer of an AGI that is scheduled to hit start in three weeks. I quit when I saw that the 'Friendliness' intended is actually a dystopia, and my protests were suppressed. I have just cancelled my cryonics membership, and the reason your cryonic revival is dependent on killing me is that I am planning to sabotage the AI.
A catch-all: humans can always say with sincerity that they would never do something so immoral under any circumstances, without that necessarily changing their behaviour in the moment.
Is it weird that my first reaction is to ask her specific questions about the Sequences to test the likelihood of that statement's veracity?