I spoke yesterday of the epistemic prisoner's dilemma, and JGWeissman wrote:

> > Keep in mind, this creationist, despite his epistemic problems, has managed so far not to die from believing that "when they drink deadly poison, it will not hurt them at all".
>
> Not killing yourself in such grossly obvious ways is pretty easy (very few people, even creationists, let their abstract beliefs control their actions that much), and doesn't seem at all relevant to me.
>
> > maybe even enough that he thinks that saving an extra billion lives is worth cooperating with an atheist
>
> I'm sure he already thinks that, not being an alien monster and all – his problem is epistemic, not moral.
>
> > (so long as the atheist is likewise rational enough to save an extra billion lives by cooperating with a creationist)
>
> So long as the creationist thinks that, you mean. Again, he's almost certainly not aware of superrationality, so I should persuade him to cooperate however I can, then defect. (Modulo the possibility that privately precommitting to cooperate could make me more persuasive, but on casual introspection I doubt I could actually do that.)
>
> In the unlikely event the creationist is superrational, I expect we'd both start out trying to persuade each other, so we could notice the symmetry, mutually determine that we're superrational (since causal decision theorists could also start out persuading), and both cooperate (resulting in a worse outcome than if he hadn't been superrational).

To which I said:

> > Not killing yourself in such grossly obvious ways is pretty easy (very few people, even creationists, let their abstract beliefs control their actions that much), and doesn't seem at all relevant to me.
>
> You seriously think that the fact that the creationist doesn't let his abstract belief control his actions is not relevant to the question of whether he will let his abstract belief control his actions? The point is, he has ways of overcoming the foolishness of his beliefs when faced with an important problem.

And lo, JGWeissman saved me a lot of writing when he replied thus:

> ...I'm sure he already thinks that, not being a
I make one small modification. You and your creationist friend are actually not that concerned about money, being distracted by the massive meteor about to strike the earth from an unknown direction. Fortunately, Omega is promising to protect limited portions of the globe, based on your decisions (I think you've all seen enough PDs that I can leave the numbers as an exercise).
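Since the numbers are left as an exercise, here is one way a reader might fill them in — a minimal sketch of the payoff structure, where every figure is an illustrative assumption of mine, not part of the post. All that matters is that the inequalities of a standard prisoner's dilemma hold:

```python
# Hypothetical payoffs (millions of people Omega shields), indexed by
# (your move, the creationist's move). The specific numbers are assumed
# purely for illustration; only their ordering matters.
PAYOFFS = {
    ("cooperate", "cooperate"): (2000, 2000),  # mutual cooperation
    ("cooperate", "defect"):    (0, 3000),     # sucker's payoff for you
    ("defect",    "cooperate"): (3000, 0),     # temptation payoff for you
    ("defect",    "defect"):    (1000, 1000),  # mutual defection
}

def saved(you: str, other: str) -> int:
    """People saved on your side of the globe, given both moves."""
    return PAYOFFS[(you, other)][0]

# The dilemma: whatever the creationist does, defecting saves more people
# on your side -- yet mutual cooperation beats mutual defection.
assert saved("defect", "cooperate") > saved("cooperate", "cooperate")
assert saved("defect", "defect") > saved("cooperate", "defect")
assert saved("cooperate", "cooperate") > saved("defect", "defect")
```

Any payoff assignment satisfying those three assertions reproduces the same dilemma.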
It is this then which I call the true epistemic prisoner's dilemma. If I tell you a story about two doctors, even if I tell you to put yourself in the shoes of one, and not the other, it is easy for you to take yourself outside them, see the symmetry and say "the doctors should cooperate". I hope I have now broken some of that emotional symmetry.
As Omega led the creationist to the other room, you would (I know I certainly would) make a convulsive effort to convince him of the truth of evolution. Despite every pointless, futile argument you've ever had in an IRC room or a YouTube thread, you would struggle desperately, calling out every half-remembered fragment of Dawkins or Sagan you could muster, in hope that just before the door shut, the creationist would hold it open and say "You're right, I was wrong. You defect, I'll cooperate -- let's save the world together."
But of course, you would fail. And the door would shut, and you would grit your teeth, and curse 2000 years of screamingly bad epistemic hygiene, and weep bitterly for the people who might die in a few hours because of your counterpart's ignorance. And then -- I hope -- you would cooperate.