Vladimir_Nesov comments on The Blackmail Equation - Less Wrong

13 Post author: Stuart_Armstrong 10 March 2010 02:46PM


Comment author: Vladimir_Nesov 10 March 2010 08:53:35PM *  1 point [-]

I don't propose fooling anyone; signaling is most effective when it's truthful.

What could it mean to "make a precommitment", if not to signal the fact that your strategy is a certain way? Your strategy either is, or isn't, a certain way; this is a fixed fact about yourself, facts don't change. This being apparently the only resolution, I was not so much correcting as elucidating what you were saying (while assuming you hadn't thought of this elucidation explicitly), in order to make the conclusion easier to see: that the problem is with the inability to signal counterfactual aspects of the strategy.

Comment author: FAWS 10 March 2010 09:10:26PM *  1 point [-]

I don't propose fooling anyone; signaling is most effective when it's truthful.

Signaling is about perceptions, not the truth by necessity. That means that fooling is at least a hypothetical possibility, which is not the case for my use of precommitment.

What could it mean to "make a precommitment", if not to signal the fact that your strategy is a certain way?

Taking the decision not to change your mind later in a way you will stick to. If, as you seem to suggest, the question of whether the agent later acts a certain way is already implicit in its original source code, then this agent already comes into existence precommitted (or not, as the case may be).

Comment author: Vladimir_Nesov 10 March 2010 09:30:05PM *  2 points [-]

Taking the decision not to change your mind later in a way you will stick to.

That you've taken this decision is a fact about your strategy (as such, it's timeless: looking at it from ten years ago doesn't change it). There is a similar fact of what you'd do if the situation was different.

Did you read about counterfactual mugging, and do you agree that one should give up the money? No precommitment in this sense could help you there: there is no explicit decision in advance; it has to be a "passive" property of your strategy (the distinction between a decision that was "made" and one that wasn't is a superficial one -- that's my point).

If, as you seem to suggest, the question of whether the agent later acts a certain way is already implicit in its original source code, then this agent already comes into existence precommitted (or not, as the case may be).

How could it be otherwise? And if so, "deciding to precommit" (in the sense of fixing this fact at a certain moment) is impossible in principle. All you can do is tell the other player about this fact, maybe only after you yourself discovered it (as being the way to win, and so the thing to do, etc.)

Comment author: FAWS 10 March 2010 09:40:59PM *  1 point [-]

That you've taken this decision is a fact about your strategy (as such, it's timeless: looking at it from ten years ago doesn't change it). There is a similar fact of what you'd do if the situation was different.

Yes, it's a fact about your strategy, but this particular strategy would not have been your strategy before making that decision (it may have been a strategy you were considering, though). Unless you want to argue that there is no such thing as a decision, which would be a curious position in the context of a thought experiment about decision theory.

Did you read about counterfactual mugging, and do you agree that one should give up the money?

Yes, I considered myself precommitted to hand over the money when reading that. I would not have considered myself precommitted before my speculations about time travel a couple of years ago; and if I had read the scenario of the counterfactual mugging and nothing else here, and had been forced to say, without time to think it through, whether I would hand over the money, I would have said that I would not (I can't tell what I would have said given unlimited time).
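The counterfactual mugging discussed here has a simple expected-value structure. A minimal sketch follows; the $10,000 reward and $100 fee are the amounts conventionally used in statements of the problem, assumed here for illustration:

```python
# Counterfactual mugging: Omega flips a fair coin.
#   Heads: Omega pays a large reward, but only to an agent whose fixed
#          strategy would hand over a small fee on tails.
#   Tails: Omega asks the agent for the small fee.
def expected_value(pays_on_tails: bool,
                   reward: float = 10_000.0,
                   fee: float = 100.0) -> float:
    """Expected winnings, averaged over the coin flip, for an agent whose
    disposition (pay or refuse on tails) is a fixed fact known to Omega."""
    heads = reward if pays_on_tails else 0.0   # Omega rewards only payers
    tails = -fee if pays_on_tails else 0.0     # payers hand over the fee on tails
    return 0.5 * heads + 0.5 * tails

assert expected_value(True) == 4950.0   # being a payer wins on average
assert expected_value(False) == 0.0
```

Since only the fixed disposition enters the calculation, the time at which the coin was tossed, or at which one "decided", makes no difference, which is the point pressed in the exchange below.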

Comment author: Vladimir_Nesov 10 March 2010 09:56:34PM 0 points [-]

Yes, I considered myself precommitted to hand over the money when reading that. I would not have considered myself precommitted before my speculations about time travel a couple of years ago; and if I had read the scenario of the counterfactual mugging and nothing else here, and had been forced to say, without time to think it through, whether I would hand over the money, I would have said that I would not (I can't tell what I would have said given unlimited time).

Would it make a difference if Omega told you that it tossed the coin a thousand years ago (before you'd "precommitted"), but only came for the money now?

Comment author: FAWS 10 March 2010 10:03:05PM 2 points [-]

That would make no difference whatsoever of course. Only the time I learn about the mugging matters.

Comment author: Vladimir_Nesov 10 March 2010 10:17:58PM 0 points [-]

But the coin precommitted to demand the money from you first. How do you reconcile this with your position about the order of precommitments?

Comment author: FAWS 10 March 2010 10:25:47PM 1 point [-]

Are you trying to make fun of me?

Comment author: Vladimir_Nesov 10 March 2010 10:37:38PM *  1 point [-]

No, a serious question. I was referring to the discussion starting from the top-level comment here (it's more of praise's position -- my mistake for confusing this -- it's unclear whether you agree).

Comment author: FAWS 10 March 2010 10:54:34PM *  0 points [-]

"Who precommits first wins" means that if one party can make the other party learn about its precommitment before that party can commit, the first party wins. Not because commitment has magical powers that vary with time, but because learning about the precommitment makes making an exception in just this one case "rational" (and if it's not "rational" to you, you had already implicitly precommitted).
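The "who precommits first wins" logic can be sketched as a toy blackmail game. The payoff numbers and function names below are illustrative assumptions, not taken from the thread:

```python
# Toy blackmail game: a blackmailer threatens damage 100 unless paid 10.
# What each side should do depends only on which commitment it has
# already learned about -- not on any intrinsic power of "committing".
PAY, REFUSE = "pay", "refuse"

def victim_best_response(blackmailer_committed: bool,
                         payment: int = 10, damage: int = 100) -> str:
    """Once the victim learns of a credible threat, paying (losing 10)
    beats refusing (losing 100); before that, there is nothing to pay for."""
    if blackmailer_committed and payment < damage:
        return PAY
    return REFUSE

def blackmail_profitable(victim_committed_to_refuse: bool) -> bool:
    """A blackmailer who learns of the victim's refusal precommitment
    gains nothing by issuing a costly threat."""
    return not victim_committed_to_refuse

# Whichever commitment the other side learns about first carries the day:
assert victim_best_response(blackmailer_committed=True) == PAY
assert blackmail_profitable(victim_committed_to_refuse=True) is False
```

The order matters only through what each party knows when it chooses, which matches the claim that commitment has no "magical powers that vary with time".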

Comment author: Vladimir_Nesov 10 March 2010 09:57:14PM *  -1 points [-]

Yes, it's a fact about your strategy, but this particular strategy would not have been your strategy before making that decision.

Determinism doesn't allow such magic. You need to read up on free will.

Comment author: FAWS 10 March 2010 10:11:02PM *  3 points [-]

Are you being deliberately obtuse?

I consider a strategy that involves killing myself in certain circumstances, but have not yet committed to it.

  • Before I can do so, these circumstances suddenly arise. I chicken out and don't kill myself, because I haven't committed yet (or psyched myself up, if you want to call it that). That strategy wasn't really my strategy yet.

  • Five minutes later I have committed myself to that strategy. The circumstances I would kill myself under arise, and I actually do it (or so I hope; I'm not completely sure I can make precommitments that strong). The strategy I previously considered is now my strategy.

How is any of that free will magic?

Comment author: Vladimir_Nesov 10 March 2010 10:29:59PM *  0 points [-]

Thanks, this explains the "would not have been your strategy" thing.

So, when you talk about "X is not my strategy", you refer to particular time: X is not the algorithm you implement at 10AM, but X is the algorithm you implement at 11AM. When you said "before I decided at 10:30AM, X wasn't my strategy", I heard "before I decided at 10:30AM, at 11AM there was no fact about which strategy I implement, but after that, there appeared a fact that at 11AM I implement X", while it now seems that you meant "at 10AM I wasn't implementing X; I decided to implement X at 10:30AM; at 11AM I implemented X". Is the disagreement resolved? (Not the original one though, of the top-level comment -- that was about facts.)
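The disambiguation can be stated directly: "my strategy" is indexed by time, and a decision is an ordinary event that makes the later value differ from the earlier one, with no meta-time involved. A minimal sketch, using the times and the label X from the comment (the strategy names are placeholders):

```python
# "Which strategy do I implement?" is indexed by time. The timeline as a
# whole is a fixed fact; deciding at 10:30AM is an ordinary event on that
# timeline, not a change made to it "from outside".
strategy_at = {
    "10:00AM": "not yet X",     # before the decision
    "10:30AM": "decide on X",   # the decision is itself an event in time
    "11:00AM": "X",             # after the decision
}

# The agent changes over plain, normal time...
assert strategy_at["10:00AM"] != strategy_at["11:00AM"]
# ...while the fact of what it implements at each time is itself fixed:
assert strategy_at["11:00AM"] == "X"
```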

Comment author: FAWS 10 March 2010 10:39:23PM 0 points [-]

Yes. I can't see why you would interpret my position in a way that is both needlessly complicated (taking "before" to be a statement about some sort of meta-time rather than just plain normal time?) and doesn't make any sense whatsoever, though.

Comment author: Vladimir_Nesov 10 March 2010 11:00:31PM *  0 points [-]

Well, it's a common failure mode, and you should figure out some way of signaling that you don't fall into it (and I should learn to ask the right questions). Since you can change your mind about what to do at 11AM, it's appealing to think that you can also change the fact of the matter of what happens at 11AM. To avoid such confusion, it's natural enough to think of "the algorithm you implement at 10AM" and "the algorithm you implement at 11AM" as unrelated facts that don't change (but that depend on, and are controlled by, particular systems, such as your source code at a given time, or are even "acausally", or "logically", controlled by the algorithms in terms of which they are defined).

Comment author: Vladimir_Nesov 10 March 2010 09:44:55PM 0 points [-]

Signaling is about perceptions, not the truth by necessity.

Any evidence, that is any way in which you may know facts about the world, is up to interpretation, and you may err in interpreting it. But it's also the only way to observe the truth.

Comment author: FAWS 10 March 2010 09:51:25PM 1 point [-]

You are talking about the relation between truth and your own perceptions. None of this is relevant to the relation between truth and what you want other people's perceptions to be, which is the context in which those words are used in the post you replied to. Are you deliberately trying to misinterpret me? Do I need to make all of my posts lawyer-proof?

Comment author: Vladimir_Nesov 10 March 2010 10:04:24PM 0 points [-]

Are you deliberately trying to misinterpret me?

No.

You are talking about the relation between truth and your own perceptions. None of this is relevant to the relation between truth and what you want other people's perceptions to be, which is the context in which those words are used in the post you replied to.

The other people will interpret your words depending on whether they expect them to be in accordance with reality. Thus, I'm talking about the relation between the way your words will be interpreted by the people you talk to, and the truth of your words. If signaling (communication) bore no relation to the truth, it would be as useless as listening to white noise.

Comment author: FAWS 10 March 2010 10:22:56PM 1 point [-]

You're doing it again. I never said that signaling bore no relationship to the truth whatsoever; I said it was about perceptions and not, by necessity, about the truth. What I (obviously, it seemed to me) meant was that signaling means attempting to manipulate the perceptions of others in a certain way, and that this does not necessarily mean changing the reality of the thing those perceptions are about.

Comment author: Vladimir_Nesov 10 March 2010 10:40:20PM *  0 points [-]

You can't change reality... You can only make something change in time, but every instant, as well as the whole shape of the process of change, is a fixed fact.

By signalling I mean, for example, speaking (though the term fits better in the original game). Of course, you are trying to manipulate the world (in particular, perceptions of other people) in a certain way by your actions, but it's a general property shared by all actions.

Comment author: FAWS 10 March 2010 10:45:41PM *  2 points [-]

You can't change reality in this meta-time sort of sense you seem so eager to assign to me. If I take a book out of the bookcase and put it on my desk, I have changed the reality of where that book is. I haven't changed where that book will be in two minutes in your meta-time sense, through magical free-will powers at the meta-time of making the decision, but I have changed the reality of where that book is in the plain English sense.

EDIT: You edited your post while I was replying. I only saw the first sentence.

Comment author: Vladimir_Nesov 10 March 2010 11:04:49PM 0 points [-]

Agreed.

Comment author: wedrifid 10 March 2010 09:14:00PM 0 points [-]

this is a fixed fact about yourself, facts don't change.

What I was 10 years ago is a fixed fact about what I was 10 years ago. That doesn't change. But I have.

Comment author: Vladimir_Nesov 10 March 2010 09:22:18PM 0 points [-]

So? (Not a rhetorical question.)

Comment author: wedrifid 10 March 2010 09:33:29PM *  0 points [-]

The point is that it is not a fixed fact about yourself unless you have an esoteric definition of self that is "what I was, am or will be at one particular instant in time". Under the conventional meaning of 'yourself', you can change and do so constantly. Essentially the 'So?' is a fundamental rejection of the core premise of your comment.

(We disagree about a fundamental fact here. It is a fact that appears trivial and obvious to me and I assume your view appears trivial and obvious to you. It doesn't seem likely that we will resolve this disagreement. Do you agree that it would be best for us if we just leave it at that? You can, of course, continue the discussion with FAWS who on this point at least seems to have the same belief as I.)

Comment author: Vladimir_Nesov 10 March 2010 10:12:52PM *  0 points [-]

Also, you shouldn't agree with the statement I cited here. (At least, it seems to be more clear-cut than the rest of the discussion.) Do you?

Comment author: wedrifid 10 March 2010 10:26:23PM 0 points [-]

I agree with the statement of FAWS' that you quoted there. Although I do note that FAWS' statement is ambiguous. I only agree with it to the extent that the meaning is this:

Yes, it's a fact about your strategy, but this particular strategy would not have been your strategy before making [the decision to precommit, which involved some change in the particles of the universe such that your new state is one that will take a certain action in a particular circumstance].

Comment author: Vladimir_Nesov 10 March 2010 10:34:08PM 0 points [-]

Still ambiguous, and hints at non-lawful changes, though likely not at all intended. It's better to merge in this thread (see the disambiguation attempt).

Comment author: Vladimir_Nesov 10 March 2010 09:50:16PM *  0 points [-]

What is the fact about which you see us disagreeing? I don't understand this discussion as having a point of disagreement. From my point of view, we are arguing relevance, not facts. (For example, I don't see why it's interesting to talk of "Who this fact is about?", and I've lost the connection of this point to the original discussion.)

Comment author: wedrifid 10 March 2010 10:01:03PM *  0 points [-]

What is the fact about which you see us disagreeing?

  1. You can modify your source code.
  2. You can make precommitments.
  3. The answer to "What could it mean to 'make a precommitment'?" is: to make a precommitment. That is a distinct thing from "signalling that you have made a precommitment". (If you make a precommitment and do not signal it effectively then it sucks for you.)
  4. More simply, on the point on which you were disagreeing with FAWS, I assert that:
    • FAWS' position does have meaning.
    • FAWS' meaning is a different meaning to what you corrected it to.
    • FAWS is right.

I don't understand this discussion as having a point of disagreement. From my point of view, we are arguing definitions, not facts.

It is probably true that we would make the same predictions about what would happen in given interactions between agents.

Comment author: Vladimir_Nesov 10 March 2010 10:11:45PM 0 points [-]
  1. You can modify your source code.

Sure, why not?

  2. You can make precommitments.
  3. The answer to "What could it mean to 'make a precommitment'?" is: to make a precommitment.

Not helping!

That is a distinct thing from "signalling that you have made a precommitment". (If you make a precommitment and do not signal it effectively then it sucks for you.)

Of course, having a strategy that behaves in a certain way and signaling this fact are different things. It isn't necessarily a bad thing to hide something (especially from a jumble of wires that cares about your thoughts, and not just your actions, as a terminal value).

  4. More simply, on the point on which you were disagreeing with FAWS, I assert that:
    • FAWS' position does have meaning.
    • FAWS' meaning is a different meaning to what you corrected it to.
    • FAWS is right.

Not helping!

Comment author: wedrifid 10 March 2010 10:34:27PM 0 points [-]

Not helping!

No, it is not. You asked (with some implied doubt) where we disagree. I answered as best I could. As I stated, we are probably not going to resolve our disagreement so I will leave it at that, with no disrespect intended beyond, as Robin often describes, the inevitable disrespect implicit in the actual fact of disagreement.

Comment author: Vladimir_Nesov 10 March 2010 10:48:35PM 0 points [-]

The "Not helping!" parts didn't explain where we disagree (which facts I believe are one way and you believe are the other); they just asserted that we do disagree.

But the last sentence suggests that we disagree about the definition of disagreement, because how could we disagree if you concede that

It is probably true that we would make the same predictions about what would happen in given interactions between agents.