Unknowns comments on The Smoking Lesion: A problem for evidential decision theory - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Do you think that in real life you are exempt from the laws of physics?
If not, does that mean that "what you will do is already decided"? That you don't have to make a decision? That you are "incapable of rationality"?
In the real world, the information that determines my action is contained within me. In order to determine the action, you would have to run "me" (or at least some reasonable part thereof).
In your version of Newcomb's problem, the information that determines my action is contained within the machine.
Can you see why I consider that a significant difference?
No. The machine determines your action only by determining what is in you, which determines your action in the normal way.
So you still have to decide what to do.
Do you see how this scenario rules out the possibility of me deciding rationally?
EDIT: In fact, let me explain now, before you answer. Give me a sec and I'll re-edit.
EDIT2: If the rational decision is to two-box, and Omega has set me to one-box, then I must not be deciding rationally. Correct?
If the rational decision is to one-box, and Omega has set me to two-box, then I must not be deciding rationally. Correct?
Now, assuming I will not decide rationally, as I know I will not, I need waste no time thinking. I'll do whichever I feel like.
You can substitute "the laws of physics" for "Omega" in your argument, and if it proves you will not decide rationally in the Omega situation, then it proves you will not decide *anything* rationally in real life.
Presumably (or at least hopefully), if you are a rational agent with a certain DT, then a long and accurate description of the ways that "the laws of physics" affect your decision-making process breaks down into a description of that DT at work.
It's not clear how a reduction like this could work in your example.
In my example, it is given that Omega decides what you are going to do, but that he causes you to do it in the same way you ordinarily do things, namely with some decision theory and by thinking some thoughts, etc.
If the fact that Omega causes it means that you are irrational, then the fact that the laws of physics cause your actions also means that you are irrational.
A rational entity can exist within the laws of physics. A rational entity's decision is, by definition, determined whenever a rational decision is possible. A rational entity cannot make an irrational decision.
You're getting hung up on the determinism. That's not the issue. Rational entities are by definition deterministic.
What they are not is deterministically irrational. Your scenario requires an irrational entity.
Your scenario requires that the entity be able to make an irrational decision using its normal thought processes. This requires that it be using irrational thought processes.
It seems you are simply assuming away the problem. Your assumptions:
Then, the described scenario is simply inconsistent, if Omega can use a rational entity as a subject. And so it comes down to which bullet you want to bite. Is it:
I'm somewhat willing to grant A, B, or D, and less apt to grant C or E.
I'm not sure if you have an objection thus far that this does not encapsulate.
D doesn't make sense to me. If they make their decisions rationally, that shouldn't result in an irrational act at any point. If rational decision-making can result in irrational decisions we have a contradiction.
C would not have to be true for all entities, just rational ones, which seems entirely possible.
But I still hold with something very similar to B.
There isn't a real choice. What you will do has been decided from outside you, and no matter how much you think you're not going to change that.
I was simply attempting to show that it is irrelevant to talk about what you should, rationally, do in the scenario, because the scenario doesn't allow rational choice. It doesn't actually allow choice at all, but that's harder to demonstrate than demonstrating that it doesn't allow rational choice.
Apparently I'm not doing a very good job of it.
No, it proves I will not decide everything rationally if I don't decide everything rationally. Which is pretty tautologous.
The Omega example requires that I will not decide everything rationally.
The real world permits the possibility of a rational agent. Thus it makes sense to question what a rational agent would do. Your scenario doesn't permit a rational agent, thus it makes no sense to ask what a rational agent would do.
You're missing the point, Unknowns. In your scenario, my decision doesn't depend on how I decide. It just depends on the setting of the box. So I might as well just decide arbitrarily and save effort.
What would you do in your own scenario?
In real life, your decision doesn't depend on how you decide it. It just depends on the positions of your atoms and the laws of physics. So you might as well just decide arbitrarily, and save effort.
I would one-box.
So, if Omega programmed you to two-box, you would one-box?
That's not exactly consistent. In fact, that's logically impossible.
Essentially, you're denying your own scenario.
You left out some steps in your argument. It appears you were going for a disjunction elimination, but if so I'm not convinced of one premise. Let me lay out more explicitly what I think your argument is supposed to be, then I'll show where I think it's gone wrong.
A = "The rational decision is to two-box" B = "Omega has set me to one-box" C = "The rational decision is to one-box" D = "Omega has set me to two-box" E = "I must not be deciding rationally"
I'll grant #1 and #2. This is a valid argument, but the dubious proposition is #3. It is entirely possible that (A∧D) or that (C∧B). And in those cases, E is not guaranteed.
In short, you might decide rationally in cases where you're set to one-box and it's rational to one-box.
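To make the shape of that objection concrete, here's a quick brute-force check (a Python sketch of the argument's form, using the five propositions defined above; the script is just an illustration, not anything from the scenario itself):

```python
from itertools import product

# Propositions, as defined above:
#   A = "the rational decision is to two-box"
#   B = "Omega has set me to one-box"
#   C = "the rational decision is to one-box"
#   D = "Omega has set me to two-box"
#   E = "I must not be deciding rationally"

def implies(p, q):
    return (not p) or q

def premise1(A, B, E):
    return implies(A and B, E)   # (A∧B) → E

def premise2(C, D, E):
    return implies(C and D, E)   # (C∧D) → E

def premise3(A, B, C, D):
    return (A and B) or (C and D)  # (A∧B) ∨ (C∧D)

# With all three premises, E follows in every model (disjunction elimination):
assert all(
    E
    for A, B, C, D, E in product([False, True], repeat=5)
    if premise1(A, B, E) and premise2(C, D, E) and premise3(A, B, C, D)
)

# Drop premise 3 and there is a countermodel: A∧D ("rational to two-box,
# set to two-box") with E false still satisfies premises 1 and 2.
A, B, C, D, E = True, False, False, True, False
assert premise1(A, B, E) and premise2(C, D, E) and not E
```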
Proposition 3 is only required to be possible, not to be true, and is supported by the existence of both paths of the scenario: the scenario requires that both A and B are possible.
It is possible that I will make the rational decision in one path of the scenario. But the scenario contains both paths. In one of the two paths I must be deciding irrationally.
Given that, as stated, I will use my normal thought-processes in both paths, my normal thought-processes must be irrational in order for this scenario to be possible.
You're mixing modes.
It is not the case that in order for this scenario to be possible, your normal thought-processes must be necessarily irrational. Rather, in order for this scenario to be possible, your normal thought-processes must be possibly irrational. And clearly that's the case for normal non-supernatural decision-making.
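In symbols, writing R for "I decide rationally" (my notation, not the parent's): the scenario requires ◇¬R (it is possible that I decide irrationally), not □¬R (it is necessary that I decide irrationally).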
ETA: Unknowns stated the conclusion better
Let's try a different tack: Is it rational to decide rationally in Unknowns' scenario?
1. Thinking takes effort, and this effort is a disutility (-c).
2. If I don't think, I will come to the answer the machine is set to (of utility X).
3. If I do think, I will come to the answer the machine is set to (of utility X).

My outcome if I don't think is X. My outcome if I do think is X - c, which is less than X. I shouldn't waste my effort thinking this through.
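The same arithmetic with illustrative numbers (a sketch; nothing depends on the specific values, only on c > 0):

```python
# Toy version of the argument above: thinking costs effort c > 0,
# and the machine's setting fixes the outcome's utility X either way.
X = 10.0  # utility of the outcome the machine is set to (illustrative)
c = 0.5   # disutility of the effort of thinking (illustrative)

utility_without_thinking = X
utility_with_thinking = X - c

# Given premises 1-3, thinking is strictly worse:
assert utility_with_thinking < utility_without_thinking
```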
If you did not know about the box, you'd experience your normal decision-making apparatus output a decision in the normal way. Either you're the sort of person who generally decides rationally or not, and if you're a particularly rational person the box might have to make you do some strange mental backflips to justify the decision in the case that it's not rational to make the choice the box specifies.
It is isomorphic, in this sense, to the world determining your actions, except that you'll get initial conditions that are very strange, in half the times you play this game (assuming a 50% chance of either outcome).
If you know about the box, then it becomes simpler, as you will indeed be able to use this reasoning and the box will probably just have to flip a bit here or there to get you to pick one or the other.
If you're not the sort of person who usually decides rationally, then following your strategy should be easy. For me, I anticipate that I would decide rationally half the time, and go rather insane the other half (assuming there was a clear rational decision, as you implied above).