I spoke yesterday of the epistemic prisoner's dilemma, and JGWeissman wrote:
I am having some difficulty imagining that I am 99% sure of something, yet can neither convince a person to outright agree with me nor get him to accept that he is uncertain and therefore should make the choice that would help more if it is right, but could still convince that same person to cooperate in the prisoner's dilemma. However, if I did find myself in that situation, I would cooperate.
To which I said:
Do you think you could convince a young-earth creationist to cooperate in the prisoner's dilemma?
And lo, JGWeissman saved me a lot of writing when he replied thus:
Good point. I probably could. I expect that the young-earth creationist has a huge bias that does not have to interfere with reasoning about the prisoner's dilemma.
So, suppose Omega finds a young-earth creationist and an atheist, and plays the following game with them. They will each be taken to a separate room, where the atheist will choose between each of them receiving $10000 if the earth is less than 1 million years old or each receiving $5000 if the earth is more than 1 million years old, and the young-earth creationist will have a similar choice with the payoffs reversed. Now, with the prisoner's dilemma tied to the young-earth creationist's bias, would I, in the role of the atheist, still be able to convince him to cooperate? I don't know. I am not sure how much the need to believe that the earth is around 5000 years old would interfere with recognizing that it is in his interest to choose the payoff for the earth being over a million years old. But still, if he seemed able to accept it, I would cooperate.
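To make the structure of Omega's game explicit, here is a minimal sketch in Python. The layout and the "cooperate"/"defect" labels are my own: "cooperate" means choosing the $10000 option that pays out under the other player's belief, "defect" the $5000 option that pays out under your own. It tabulates the payoffs as each player expects them:

```python
# A sketch of Omega's game from the quoted comment. Each choice awards
# its amount to BOTH players if its condition on the earth's age holds.
ACTIONS = {
    "atheist":     {"cooperate": (10000, "young"), "defect": (5000, "old")},
    "creationist": {"cooperate": (10000, "old"),   "defect": (5000, "young")},
}

def payoff(atheist_action, creationist_action, true_age):
    """Amount each player receives, given both choices and the true age."""
    total = 0
    for player, action in (("atheist", atheist_action),
                           ("creationist", creationist_action)):
        amount, condition = ACTIONS[player][action]
        if condition == true_age:
            total += amount
    return total  # the game is symmetric: both players receive this total

# Print the game as each player expects it to play out.
for believer, believed_age in (("atheist", "old"), ("creationist", "young")):
    print(f"As the {believer} sees it (earth is {believed_age}):")
    for a in ("cooperate", "defect"):
        for c in ("cooperate", "defect"):
            print(f"  atheist {a}s, creationist {c}s: "
                  f"${payoff(a, c, believed_age)} each")
```

Run it and the prisoner's dilemma appears inside each player's own world-model: defecting adds $5000 no matter what the other does, yet mutual cooperation pays $10000 each against $5000 for mutual defection.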
I make one small modification. You and your creationist friend are actually not that concerned about money, being distracted by the massive meteor about to strike the earth from an unknown direction. Fortunately, Omega is promising to protect limited portions of the globe, based on your decisions (I think you've all seen enough PDs that I can leave the numbers as an exercise).
It is this, then, which I call the true epistemic prisoner's dilemma. If I tell you a story about two doctors, even if I tell you to put yourself in the shoes of one and not the other, it is easy for you to take yourself outside them, see the symmetry, and say "the doctors should cooperate". I hope I have now broken some of that emotional symmetry.
As Omega led the creationist to the other room, you would (I know I certainly would) make a convulsive effort to convince him of the truth of evolution. Despite every pointless, futile argument you've ever had in an IRC room or a YouTube thread, you would struggle desperately, calling out every half-remembered fragment of Dawkins or Sagan you could muster, in the hope that just before the door shut, the creationist would hold it open and say "You're right, I was wrong. You defect, I'll cooperate -- let's save the world together."
But of course, you would fail. And the door would shut, and you would grit your teeth, and curse 2000 years of screamingly bad epistemic hygiene, and weep bitterly for the people who might die in a few hours because of your counterpart's ignorance. And then -- I hope -- you would cooperate.
With this attitude, you won't be able to convince him. He'll expect you to defect, no matter what you say. It's obvious to you what you'll do, and it's obvious to him. By refusing to save a billion people, and instead choosing the meaningless alternative option, you perform an instrumental action that results in your opponent saving 2 billion people. You control the other player indirectly.
Choosing the option other than saving 1 billion people has no terminal value, but it does have instrumental value, and more of it than directly saving 1 billion people does.
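To see how that arithmetic can come out this way, here is a hedged sketch. Assume, as the comment implicitly does, that the creationist's choice is correlated with yours, matching it with some probability r (a parameter I am introducing for illustration; it is not in the post). The 1-billion/2-billion figures are the comment's own:

```python
# Sketch of the instrumental-value claim, under an ASSUMED correlation r
# between your choice and the creationist's. By the comment's numbers
# (and your own beliefs): your defection saves 1 billion directly, his
# cooperation saves 2 billion, and your cooperation / his defection save
# no one.
def expected_billions_saved(you_cooperate, r):
    yours = 0.0 if you_cooperate else 1.0            # direct effect of your choice
    p_he_cooperates = r if you_cooperate else 1 - r  # the correlation assumption
    return yours + 2.0 * p_he_cooperates             # his cooperation saves 2 billion

for r in (0.5, 0.75, 0.9, 1.0):
    print(f"r={r:.2f}: cooperate -> "
          f"{expected_billions_saved(True, r):.2f}B, "
          f"defect -> {expected_billions_saved(False, r):.2f}B")
```

On this model, cooperating wins exactly when r > 3/4; the comment's claim amounts to saying that, against a counterpart who reasons as you do, r is high enough for the "meaningless" option to be worth more than the billion lives you could save directly.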
This is not to say that you can place this kind of trust easily; with humans, you may indeed need to make a tangible precommitment. Humans are broken by default: in some situations you don't expect the right actions from them, the way you don't expect the right actions from rocks. An external precommitment is a crutch that compensates for these inborn ailments.
Um... are you asserting that deception between humans is impossible?