Stuart_Armstrong comments on The True Prisoner's Dilemma - Less Wrong

53 Post author: Eliezer_Yudkowsky 03 September 2008 09:34PM



Comment author: Stuart_Armstrong 04 September 2008 01:06:55PM -2 points

Despite the disguise, I think this is the same as the standard PD. There (assuming fully specified utilities, etc.), the obvious ideal for an impartial observer is (C,C) as the best option, while each prisoner's individually best outcome is (D,C).

Here, (D,C) is "righter" than (C,C), but that's simply because we are no longer impartial observers; humans shouldn't remain impartial when billions of lives are at stake. We are all in the role of "prisoners" in this situation, even as observers.

An "impartial observer" would simply be one that valued one billion human lives the same as one paper clip. They would see us as a simple prisoner, in the same situation as the standard PD, with the same overall solution - (C,C).
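The distinction between the impartial observer's pick and each prisoner's pick can be made concrete with a standard PD payoff matrix. A minimal sketch (the payoff numbers are illustrative assumptions, not taken from the post):

```python
# A standard Prisoner's Dilemma with illustrative (assumed) payoffs.
# payoffs[(my_move, their_move)] = (my_payoff, their_payoff)
payoffs = {
    ("C", "C"): (3, 3),  # mutual cooperation
    ("C", "D"): (0, 5),  # I cooperate, they defect
    ("D", "C"): (5, 0),  # I defect, they cooperate
    ("D", "D"): (1, 1),  # mutual defection
}

# An impartial observer ranks outcomes by total payoff across both players.
impartial_best = max(payoffs, key=lambda o: sum(payoffs[o]))

# A single prisoner ranks outcomes by their own payoff alone.
prisoner_best = max(payoffs, key=lambda o: payoffs[o][0])

print(impartial_best)  # ('C', 'C') -- best for the pair taken together
print(prisoner_best)   # ('D', 'C') -- best for one prisoner individually
```

This is just the point of the comment above: which outcome is "best" depends on whose utilities the ranking sums over.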

Comment author: RobbBB 03 February 2014 12:16:33PM 1 point

This is an old post and probably very out of date, but: I think if you try to define an impartial observer's preferences as whatever selects (C,C) in two other agents' PD, you get inconsistencies very rapidly once you have one of those agents stuck in two Prisoner's Dilemmas at once.

I also don't think we should use euphemisms like 'impartial' for an incredibly partial Cooperation Fetishist that's willing to give up everything else of value (e.g., billions of human lives) to go through the motions of satisfying non-sentient processes like sea slugs or paperclip maximizers.

Comment author: Stuart_Armstrong 03 February 2014 12:41:28PM 1 point

you get inconsistencies very rapidly once you have one of those agents stuck in two Prisoner's Dilemmas at once.

Multi-player interactions are tricky and we don't have a good solution for them yet.

that's willing to give up everything else of value (e.g., billions of human lives)

It's not that it's willing to give up everything of value - it's that it doesn't have our values. Without sharing our values, there's no reason for it to prefer our opinions over the sea slugs'.