Newcomb versus dust specks

-1 ike 12 May 2016 03:02AM

You're given the option to torture everyone in the universe, or inflict a dust speck on everyone in the universe. Either you are the only one in the universe, or there are 3^^^3 perfect copies of you (far enough apart that you will never meet). In the latter case, all copies of you are given the choice, and all make the same choice. (Edit: if they choose specks, each person gets one dust speck. This was not meant to be ambiguous.)

As it happens, a perfect and truthful predictor has declared that you will choose torture iff you are alone.

What do you do?

How does your answer change if the predictor created the copies of you conditional on its prediction?

How does your answer change if, in addition to that, you're told you are the original?

The Guardian article on longevity research [LINK]

8 ike 11 January 2015 07:02PM

Discussion of AI control over at worldbuilding.stackexchange [LINK]

6 ike 14 December 2014 02:59AM

https://worldbuilding.stackexchange.com/questions/6340/the-challenge-of-controlling-a-powerful-ai

Go insert some rationality into the discussion! (There are actually some pretty good comments in there, and some links to the right places, including LW).

Rodney Brooks talks about Evil AI and mentions MIRI [LINK]

3 ike 12 November 2014 04:50AM

Rodney Brooks says that "evil" AI is not a big problem:
http://www.rethinkrobotics.com/artificial-intelligence-tool-threat/