Newcomb versus dust specks
You're given the option to torture everyone in the universe, or to inflict a dust speck on everyone in the universe. Either you are the only one in the universe, or there are 3^^^3 perfect copies of you (far enough apart that you will never meet). In the latter case, all copies of you are given the choice, and all make the same choice. (Edit: if they choose specks, each person gets one dust speck. This was not meant to be ambiguous.)
As it happens, a perfect and truthful predictor has declared that you will choose torture iff you are alone.
What do you do?
How does your answer change if the predictor made the copies of you conditional on their prediction?
How does your answer change if, in addition to that, you're told you are the original?
Discussion of AI control over at worldbuilding.stackexchange [LINK]
https://worldbuilding.stackexchange.com/questions/6340/the-challenge-of-controlling-a-powerful-ai
Go insert some rationality into the discussion! (There are actually some pretty good comments in there, and some links to the right places, including LW.)
Rodney Brooks talks about Evil AI and mentions MIRI [LINK]
Rodney Brooks says that "evil" AI is not a big problem:
http://www.rethinkrobotics.com/artificial-intelligence-tool-threat/