Wes_W comments on Absolute Authority - Less Wrong

Post author: Eliezer_Yudkowsky 08 January 2008 03:33AM


Comment author: ike 03 August 2014 04:53:44AM 0 points

For technical reasons of probability theory, if it's theoretically possible for you to change your mind about something, it can't have a probability exactly equal to one.

This is supposed to be an argument against assigning anything a 100% probability. I agree with the conclusion, but this particular argument seems wrong. It's based on Conservation of Expected Evidence (if the "technical reasons of probability theory" refer to something else, let me know). However, Bayes' rule doesn't just imply that "having a chance of changing your mind" -> "you are not 100% certain"; it also gives us bounds on what posteriors we can have. If we assign a 5% chance to changing our minds about something, that would seem to imply that we cannot put more than 95% probability on our original claim.
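The bound being claimed here can be sketched with the law of total probability: today's credence must equal the expectation of tomorrow's posterior. A minimal illustration in Python (the numbers and variable names are mine, chosen to match the 5%/95% example above):

```python
# Conservation of expected evidence: the prior must equal the expected
# posterior over the possible evidence you might encounter.
p_change_mind = 0.05        # chance I'll see mind-changing evidence tomorrow
posterior_if_changed = 0.0  # if convinced otherwise, my credence drops to ~0
posterior_if_not = 1.0      # otherwise, at best I remain fully certain

# So today's credence can be at most:
prior_bound = (posterior_if_changed * p_change_mind
               + posterior_if_not * (1 - p_change_mind))
print(prior_bound)  # 0.95
```

In other words, a 5% chance of a total reversal caps the prior at 95%, which is exactly the constraint the argument above relies on.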

So, the reason I reject this is as follows:

EY lays out possible evidence for 2+2=3 here. Imagine you believe, at the 50% level, that someone will cause you to view that evidence tomorrow - by hypnosis, or some other method. Applying Bayes' rule the way EY seems to be applying it here, you should right now assign at most a 50% chance to 2+2=4. I think the rational thing to do in that situation (where putting the earplugs together does in fact show 2+2 equaling 4) is to believe that 2+2=4, with around as much confidence as you do now. Therefore, there is something wrong with this line of reasoning.

If anyone can point to what I'm doing wrong, or thinks that in the situation I outlined, the rational thing to do is to evaluate a 50% or lower chance of 2+2=4, I'd like to hear about it.

Comment author: Wes_W 03 August 2014 07:12:07PM 1 point

EY lays out possible evidence for 2+2=3 here. Imagine you believe, at the 50% level, that someone will cause you to view that evidence tomorrow - by hypnosis, or some other method. Applying Bayes' rule the way EY seems to be applying it here, you should right now assign at most a 50% chance to 2+2=4. I think the rational thing to do in that situation (where putting the earplugs together does in fact show 2+2 equaling 4) is to believe that 2+2=4, with around as much confidence as you do now.

Why do you think that is the correct thing to do in that situation?

Here, in this real situation, yes, you should trust your current counting abilities. But if you believe with 50% confidence that, within 24 hours, someone will be able to convince you that your ability to count is fundamentally compromised, then you also can't place a high level of confidence in your ability to count things correctly - no more than 50%, in fact.

"I can count correctly" and "[someone can demonstrate to me that] I'm counting incorrectly" are mutually exclusive hypotheses. Your confidence in the two ought not to add up to more than 1.
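The arithmetic behind that constraint, as a minimal sketch (the numbers are illustrative, not from the thread):

```python
# Credences in mutually exclusive hypotheses must sum to at most 1.
p_i_count_correctly = 0.6  # credence in "I can count correctly"
p_shown_incorrect = 0.5    # credence in "someone can demonstrate I'm wrong"

coherent = p_i_count_correctly + p_shown_incorrect <= 1.0
print(coherent)  # False - this pair of credences is incoherent
```

So holding a 50% chance of being shown wrong while staying more than 50% confident in counting correctly violates basic probability axioms, which is the point being made above.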

Comment author: ike 03 August 2014 09:23:08PM 1 point

If I know that I'll actually experience that scenario tomorrow - waking up with all available evidence showing that 2+2=3 - but right now I still visualize XX+XX=XXXX, then I trust my current vast mound of evidence over a future, smaller, weirder mound of evidence. I'm not evaluating "what will I think 2+2 equals tomorrow?" (as EY points out elsewhere, this kind of question is not very useful). I'm evaluating "what is 2+2?" For that, it seems irrational to trust future evidence when I might be in an unknown state of mind. The sentence EY has repeated, "Those who dream do not know they dream; but when you wake you know you are awake", seems appropriate here. Just knowing that I will be convinced, by whatever means, is not the same as actually being convinced. What if they hack your mind and insert false memories? If you knew someone would do that tomorrow, would you think the future memories actually happened in your past?

If you're trying to make the argument that "since someone can fool me later, I could be fooled now and wouldn't notice" - well, first of all, that doesn't seem to be the argument EY is making. Second, I might indeed be in such a situation, but I'd expect that the future in which I'm being fooled would have to delete the memory of this sequence of posts (specifically the 2+2=3 post, and this series of comments). The fact that I remember them seems to point to the editing/hacking not having happened yet.

After thinking about this, I see that an intruder would just swap all the references to 2+2=4 and 2+2=3, leaving me with the same logic to justify my belief in 2+2=3. So that doesn't work.

How about this: once I have to consider my thought processes hacked, I can't unwind past that anyway, so to keep sane I'll have to assume my current thoughts are not corrupted.

Comment author: Wes_W 04 August 2014 08:25:16AM 0 points

I think deception should be treated as a special case, here. Normally, P(X | a seemingly correct argument for X) is pretty high. When you specifically expect deception, this is no longer true.

I'm not sure it's useful to consider "what if they hack your mind" in this kind of conversation. Getting hacked isn't a Bayesian update, and hallucinations do not constitute evidence.

Comment author: ike 04 August 2014 10:43:26AM 0 points

If there were a way to differentiate hallucinations from real vision, then I'd agree, but there isn't.

Anyway, I thought of a (seemingly) knockdown argument for not believing future selves: what if you currently believe at 50% that tomorrow you'll be convinced that 2+2=3, the next day that 2+2=5, and the next day that 2+2=6? (And that 2+2 has only one answer.) If you just took each of those as a minimum, your total probability mass would be at least 150%. Therefore, you can only trust your current self.
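The overcommitment can be made explicit with a quick sketch (labels mine, following the three hypothetical days above):

```python
# Treating each "50% chance I'll be convinced of X" as a 50% floor on X,
# for three mutually exclusive values of 2+2, overruns the probability budget.
floors = {"2+2=3": 0.5, "2+2=5": 0.5, "2+2=6": 0.5}
total = sum(floors.values())
print(total)         # 1.5
print(total <= 1.0)  # False - mutually exclusive credences can't exceed 1
```

Since mutually exclusive hypotheses can share at most a total probability of 1, at least some of those "chance I'll be convinced" figures cannot translate directly into current credences.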

Comment author: Wes_W 04 August 2014 03:31:41PM 1 point

If there were a way to differentiate hallucinations from real vision, then I'd agree, but there isn't.

Sure, but that is a different problem from what I'm talking about. Expecting to hallucinate is different from expecting to receive evidence. If you expect to be actually convinced, you ought to update now. If you expect to be "convinced" by hallucination, I don't think any update is required.

Framing the 2+2=3 thing as being about deception is, IMO, failing to engage with the premise of the argument.

Anyway, I thought of a (seemingly) knockdown argument for not believing future selves: what if you currently believe at 50% that tomorrow you'll be convinced that 2+2=3, the next day that 2+2=5, and the next day that 2+2=6?

I would be very confused, and very worried about my ability to separate truth from untruth. In that state, I wouldn't feel very good about trusting my current self, either.

Comment author: CCC 04 August 2014 10:10:11AM 0 points

"I can count correctly" and "[someone can demonstrate to me that] I'm counting incorrectly" are mutually exclusive hypotheses. Your confidence in the two ought not to add up to more than 1.

Not entirely. It is possible that someone may be able to provide a convincing demonstration of an untrue fact, whether due to deliberate deception, an extremely unlikely series of coincidences, or the demonstrator genuinely but incorrectly believing that what they are demonstrating is true.

So, there is some small possibility that I am counting correctly and someone can demonstrate to me that I am not counting correctly. The size of this possibility depends, among other things, on how easily I can be persuaded.