Sorry I'm late. Anyway, this seems a good place to post my two (not quite) corollaries to the original post:
Corollary 1: You can choose either a or b: a) All currently living humans, including you, will be tortured with superhuman proficiency for a billion years, with certainty. b) There is a 1-in-1,000,000 risk (otherwise nothing happens) that 3^^^3 animals get dust specks in their eyes. These animals have mental attributes that make them on average worth approximately 1/10^12 as much as a human. Further, the dust specks are so small that only those with especially sensitive eyes (about 1 in a million) can even notice them. (A sketch of the combined arithmetic follows the second item below.)
Not-a-corollary 2: The choices are as follows: a) nothing happens; b) 3^^^3 humans get tortured for 3^^^3 years, and there's a 1/3^^^3 chance that a friendly AI is released into our universe and turns out to be able to travel to any number of other universes, persisting in the multiverse and creating Fun for eternity.
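For what it's worth, here is how the discount factors in corollary 1 cash out under a plain expected-value reading. This is a minimal sketch in Python, using the much smaller tower 3^^4 = 3^(3^27) as a stand-in, since 3^^^3 itself (a tower of roughly 7.6 trillion threes) is far beyond direct computation; the three discount factors are the ones stipulated above:

```python
import math

# 3^^^3 in Knuth up-arrow notation is 3^^(3^^3) = 3^^7625597484987:
# a power tower of threes about 7.6 trillion levels high. Even the
# tiny stand-in 3^^4 = 3^(3^27) can't be written out in full, but
# its base-10 logarithm is easy to compute.
log10_tower4 = 3**27 * math.log10(3)   # ~3.64e12, i.e. ~3.6 trillion digits

# Corollary 1 discounts option (b) by three small factors:
#   1e-6   probability the event happens at all
#   1e-6   fraction of animals that even notice the speck
#   1e-12  moral weight of one animal relative to a human
discount = 1e-6 * 1e-6 * 1e-12         # 1e-24 combined

# Multiplying by 1e-24 just subtracts 24 from the base-10 logarithm.
log10_discounted = log10_tower4 + math.log10(discount)

print(f"log10(3^^4)          ~ {log10_tower4:.6e}")
print(f"log10(3^^4 * 1e-24)  ~ {log10_discounted:.6e}")
# Both print ~3.638e12: the discount shaves 24 digits off a number
# with ~3.6 trillion digits, and 3^^^3 dwarfs 3^^4 by far more.
```

The point of the sketch: all the discounts in corollary 1 together remove only 24 orders of magnitude, which is invisible next to a number whose digit count is itself astronomical, so the expected-value comparison comes out the same way as in the original post.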
Today's post, Torture vs. Dust Specks, was originally published on 30 October 2007. A summary (taken from the LW wiki):
Discuss the post here (rather than in the comments to the original post).
This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Motivated Stopping and Motivated Continuation, and you can use the sequence_reruns tag or RSS feed to follow the rest of the series.
Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.