pragmatist comments on Causal decision theory is unsatisfactory - LessWrong

20 Post author: So8res 13 September 2014 05:05PM


Comments (158)


Comment author: pragmatist 13 September 2014 08:38:48PM 1 point

What would you say about the following decision problem (formulated by Andy Egan, I believe)?

You have a strong desire that all psychopaths in the world die. However, your desire to stay alive is stronger, so if you yourself are a psychopath you don't want all psychopaths to die. You are pretty sure, but not certain, that you're not a psychopath. You're presented with a button, which, if pressed, would kill all psychopaths instantly. You are absolutely certain that only a psychopath would press this button. Should you press the button or not?

It seems to me the answer is "Obviously not", precisely because the "off-path" possibility that you're a non-psychopath who pushes the button should not enter into your consideration. But the causal decision algorithm would recommend pushing the button if your prior that you are a psychopath is small enough. Would you agree with that?
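The contrast pragmatist is drawing can be made concrete with a small sketch. The utility numbers below are illustrative assumptions, not from the thread; the point is only how CDT and an evidential calculation diverge when the prior of being a psychopath is small:

```python
# Assumed illustrative utilities: all psychopaths dying is worth +10
# to you, but your own death is worth -100.
U_ALL_PSYCHOPATHS_DIE = 10
U_OWN_DEATH = -100

def cdt_eu_push(prior_psycho):
    # CDT screens off the act: it uses your prior P(psychopath).
    return U_ALL_PSYCHOPATHS_DIE + prior_psycho * U_OWN_DEATH

def edt_eu_push():
    # An evidential calculation conditions on the act:
    # P(psychopath | push) = 1 by stipulation.
    return U_ALL_PSYCHOPATHS_DIE + 1.0 * U_OWN_DEATH

print(cdt_eu_push(0.05))  # 5.0: positive, so CDT recommends pushing
print(edt_eu_push())      # -90.0: negative, so don't push
```

With a large enough prior (e.g. 0.2), even CDT's expected utility for pushing goes negative; the disagreement only appears when the prior is small.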

Comment author: Jiro 13 September 2014 11:14:57PM 3 points

If only a psychopath would push the button, then your possible non-psychopathic nature limits what decision algorithms you are capable of following.

Comment author: helltank 13 September 2014 11:22:10PM 1 point

Wouldn't the fact that you're even considering pushing the button (because if only a psychopath would push the button, it follows that a non-psychopath would never push it) indicate that you are a psychopath, and that therefore you should not push the button?

Another way to put it is:

If you are a psychopath and you push the button, you die. If you are not a psychopath and you push the button, pushing the button would make you a psychopath (since only a psychopath would push), and therefore you die.

Comment author: pragmatist 14 September 2014 05:57:57AM 2 points

Pushing the button can't make you a psychopath. You're either already a psychopath or you're not. If you're not, you will not push the button, although you might consider pushing it.

Comment author: helltank 14 September 2014 12:51:06PM 1 point

Maybe I was unclear.

I'm arguing that the button will never, ever be pushed. If you are NOT a psychopath, you won't push, end of story.

If you ARE a psychopath, you can choose to push or not push.

If you push, that's evidence you are a psychopath. If you are a psychopath, you should not push. Therefore, you will always end up regretting the decision to push.

If you don't push, you don't push and nothing happens.

In all three cases the correct decision is not to push, therefore you should not push.

Comment author: lackofcheese 14 September 2014 01:47:00AM 1 point

Shouldn't you also update your belief towards being a psychopath on the basis that you have a strong desire that all psychopaths in the world die?

Comment author: pragmatist 14 September 2014 05:56:17AM 1 point

You can stipulate this out of the example. Let's say pretty much everyone has the desire that all psychopaths die, but only psychopaths would actually follow through with it.

Comment author: James_Miller 13 September 2014 08:43:30PM 1 point

I don't press. CDT fails here because (I think) it doesn't allow you to update your beliefs based on your own actions.

Comment author: crazy88 14 September 2014 10:05:15PM 2 points

Exactly what information CDT allows you to update your beliefs on is a matter for some debate. You might be interested in a paper by James Joyce (http://www-personal.umich.edu/~jjoyce/papers/rscdt.pdf) on the issue (which was written in response to Egan's paper).

Comment author: pragmatist 13 September 2014 08:46:49PM 1 point

But then shouldn't you also update your beliefs about what your clone will do based on your own actions in the clone PD case? Your action is very strong (perfect, by stipulation) evidence for his action.

Comment author: James_Miller 13 September 2014 08:58:03PM 1 point

Yes, I should. In the psychopath case, whether I press the button depends on my beliefs; in contrast, in a PD I should defect regardless of my beliefs.

Comment author: pragmatist 13 September 2014 09:13:31PM 1 point

Maybe I misunderstand what you mean by "updating beliefs based on action". Here's how I interpret it in the psychopath button case: When calculating the expected utility of pushing the button, don't use the prior probability that you're a psychopath in the calculation, use the probability that you're a psychopath conditional on deciding to push the button (which is 1). If you use that conditional probability, then the expected utility of pushing the button is guaranteed to be negative, no matter what the prior probability that you're a psychopath is. Similarly, when calculating the expected utility of not pushing the button, use the probability that you're a psychopath conditional on deciding not to push the button.

But then, applying the same logic to the PD case, you should calculate expected utilities for your actions using probabilities for your clone's action that are conditional on the very action that you are considering. So when you're calculating the expected utility for cooperating, use probabilities for your clone's action conditional on you cooperating (i.e., 1 for the clone cooperating, 0 for the clone defecting). When calculating the expected utility for defecting, use probabilities for your clone's action conditional on you defecting (0 for cooperating, 1 for defecting). If you do things this way, then cooperating ends up having a higher expected utility.
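The conditional-probability calculation described above can be sketched directly. The payoff numbers are an assumed standard PD matrix (illustrative, not from the thread):

```python
# Assumed standard PD payoffs to you: (your move, clone's move) -> payoff
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def eu(my_move):
    # Perfect correlation by stipulation: P(clone plays m | you play m) = 1,
    # so only the diagonal outcomes carry any probability mass.
    return PAYOFF[(my_move, my_move)]

print(eu("C"))  # 3
print(eu("D"))  # 1 -> cooperating has the higher expected utility
```

The mixed off-diagonal cells never enter the calculation, which is exactly the sense in which the clone stops being an independent agent.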

Perhaps another way of putting it is that once you know the clone's actions are perfectly correlated with your own, you have no good reason to treat the clone as an independent agent in your analysis. The standard tools of game theory, designed to deal with cases involving multiple independent agents, are no longer relevant. Instead, treat the clone as if he were part of the world-state in a standard single-agent decision problem, except this is a part of the world-state about which your actions give you information (kind of like whether or not you're a psychopath in the button case).

Comment author: James_Miller 13 September 2014 10:44:59PM 1 point

I agree with your first paragraph.

Imagine you are absolutely certain you will cooperate and that your clone will cooperate. You are still capable of asking, "What would my payoff be if I didn't cooperate?" This payoff is the payoff if you defect and the clone cooperates, since you expect the clone to do whatever you do, and you expect to cooperate. There is no reason to update my beliefs about what the clone will do in this thought experiment, since the thought experiment is about a zero-probability event.

The psychopath case is different because I have uncertainty regarding whether I am a psychopath and the choice I want to make helps me learn about myself. I have no uncertainty concerning my clone.

Comment author: VAuroch 14 September 2014 11:39:00AM 0 points

There is no reason to update my beliefs about what the clone will do in this thought experiment, since the thought experiment is about a zero-probability event.

You are reasoning about an impossible scenario; if the probability of you reaching the event is 0, the probability of your clone reaching it is also 0. In order to make it a sensical notion, you have to consider it in terms of epsilon probabilities; since the defection probability ε will be the same for both you and your clone, this gets you an expected payoff of (1-ε)²R + ε(1-ε)(S+T) + ε²P, which is maximized when ε = 0.
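The joint-tremble argument can be checked numerically. This is a minimal sketch assuming standard PD payoffs T=5, R=3, P=1, S=0 (illustrative numbers, not from the thread), where you and your clone both defect with the same probability eps:

```python
T, R, P, S = 5, 3, 1, 0  # assumed PD payoffs: temptation, reward, punishment, sucker

def joint_tremble_payoff(eps):
    # Both you and your clone defect with the same probability eps,
    # so the off-diagonal (S and T) outcomes share one correlated term.
    return (1 - eps)**2 * R + eps * (1 - eps) * (S + T) + eps**2 * P

payoffs = {e / 10: joint_tremble_payoff(e / 10) for e in range(11)}
best_eps = max(payoffs, key=payoffs.get)
print(best_eps)  # 0.0 -> expected payoff is maximized by never defecting
```

As long as 2R > S+T (which holds for any standard PD), the expected payoff is strictly decreasing in eps, so the maximum sits at eps = 0.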

To claim that you and your clone could take different actions is trying to make it a question about trembling-hand equilibria, which violates the basic assumptions of the game.

Comment author: James_Miller 14 September 2014 02:22:07PM 1 point

It's common in game theory to consider off-the-equilibrium-path situations that occur with probability zero without taking a trembling-hand approach.