PhilipL comments on The Strangest Thing An AI Could Tell You - Less Wrong

Post author: Eliezer_Yudkowsky 15 July 2009 02:27AM 81 points


Comment author: [deleted] 10 November 2012 01:36:28AM 5 points

Ted Chiang wrote a one-page short story, What's Expected of Us, about basically this, and it's scary. (pdf)

Comment author: Will_Sawin 10 November 2012 06:27:53AM 4 points

This story struck me as more silly than scary.

Comment author: [deleted] 10 November 2012 12:42:56PM 2 points

My reaction time is less than a second; what happens if I decide to press the button as soon as I hear a Geiger counter click?

Comment author: satt 10 November 2012 02:59:38PM 3 points

You find out whether Geiger counters have free will.

Comment author: pragmatist 10 November 2012 03:10:33PM 0 points

If the Predictor continues to work in this circumstance, that would be evidence against MWI, since on MWI there are two futures, one in which you push the button and one in which you don't, and both presumably send signals back to the Predictor. Since only one of these signals can determine the Predictor's behavior, it will get the prediction wrong in some branches. Consistently finding that you are not in one of those branches becomes more and more improbable as the number of trials increases.
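
[Editor's note: a minimal sketch, not part of the original thread, of the probability argument above. It assumes a toy model in which the Predictor's flash is fixed by the signal from only one branch, while each branch's button press is an independent 50/50 outcome, so any given observer sees a correct prediction on a given trial with probability 1/2. The function name, observer count, and seed are illustrative choices.]

```python
import random

def fraction_with_perfect_record(n_trials, n_observers=100_000, seed=0):
    """Fraction of simulated observers for whom the Predictor's flash
    matched their button press on every one of n_trials trials."""
    rng = random.Random(seed)
    perfect = 0
    for _ in range(n_observers):
        # Toy model: each trial matches with probability 1/2.
        if all(rng.random() < 0.5 for _ in range(n_trials)):
            perfect += 1
    return perfect / n_observers

for n in (1, 5, 10, 20):
    print(f"{n:2d} trials: simulated {fraction_with_perfect_record(n):.6f}, "
          f"expected 2**-{n} = {2 ** -n:.6f}")
```

Under these assumptions the perfect-record fraction falls roughly as 2^-n, which is the sense in which consistently finding yourself in a branch where the Predictor was right becomes increasingly improbable.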

Comment author: fubarobfusco 10 November 2012 05:11:00AM 1 point

It seems like the sort of thing that once upon a time someone could have written about souls instead of free will.