Qiaochu_Yuan comments on Bayesians vs. Barbarians - Less Wrong

51 Post author: Eliezer_Yudkowsky 14 April 2009 11:45PM



Comment author: Qiaochu_Yuan 10 May 2013 09:45:44PM

That isn't an assumption Eliezer is making, it's an assumption he's attacking.

Comment author: Squark 11 May 2013 07:04:19PM

It doesn't look like it:

Imagine a community of self-modifying AIs who collectively prefer fighting to surrender, but individually prefer being a civilian to fighting. One solution is to run a lottery, unpredictable to any agent, to select warriors. Before the lottery is run, all the AIs change their code, in advance, so that if selected they will fight as a warrior in the most communally efficient possible way—even if it means calmly marching into their own death...

You can have lotteries for who gets elected as a warrior. Sort of like the example above with AIs changing their own code. Except that if "be reflectively consistent; do that which you would precommit to do" is not sufficient motivation for humans to obey the lottery, then...

...well, in advance of the lottery actually running, we can perhaps all agree that it is a good idea to give the selectees drugs that will induce extra courage, and shoot them if they run away. Even considering that we ourselves might be selected in the lottery. Because in advance of the lottery, this is the general policy that gives us the highest expectation of survival.

Eliezer is analyzing the situation as a Prisoner's Dilemma, in which different players have different utility functions. This analysis would be completely redundant in a society where everyone has the same utility function (or at least sufficiently similar, non-egocentric utility functions). In such a society there would be no need for a lottery: the soldiers would simply be those most skilled for the job. There would also be no need for drugs or for shooting deserters: the soldiers would want to fight, because the choice to fight would carry positive expected utility (even if it means a high likelihood of death).
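The ex ante argument in the quoted passage can be illustrated with a toy expected-utility calculation. All the numbers below are illustrative assumptions (they do not appear in the original post); the point is only the structure: before the lottery runs, precommitting to serve if selected beats universal defection, even though a selected warrior would individually prefer to desert.

```python
# Toy model of the warrior lottery (all parameters are illustrative
# assumptions, not taken from the post).
N = 100                 # community size
K = 10                  # warriors needed to mount a defense
p_warrior = 0.5         # a selected warrior's survival odds if the war is fought
p_civilian = 0.95       # a civilian's survival odds if the war is fought
p_surrender = 0.6       # everyone's survival odds if no one fights

p_selected = K / N      # chance any given agent is drafted by the lottery

# Ex ante expected survival if every agent precommits to honor the lottery:
ev_precommit = p_selected * p_warrior + (1 - p_selected) * p_civilian

# Ex ante expected survival if everyone defects and the community surrenders:
ev_defect = p_surrender

print(f"precommit: {ev_precommit:.3f}, defect: {ev_defect:.3f}")
# With these numbers precommitting dominates before the lottery is run
# (0.905 > 0.6), even though an already-selected warrior would locally
# prefer the civilian's odds (0.5 < 0.95) -- which is exactly why the
# post discusses enforcement and reflective consistency.
```

Under these assumptions the dilemma Squark describes disappears only if agents evaluate the policy from behind the lottery, not after selection.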