ike comments on Absolute Authority - Less Wrong

Post author: Eliezer_Yudkowsky, 08 January 2008 03:33AM


Comment author: Wes_W 04 August 2014 08:25:16AM 0 points

I think deception should be treated as a special case here. Normally, P(X | a seemingly correct argument for X) is pretty high. When you specifically expect deception, this is no longer true.
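The point can be made concrete with Bayes' rule. A minimal sketch with illustrative numbers (not from the thread): when you expect deception, a convincing argument is likely whether or not X is true, so it stops being evidence.

```python
# Illustrative numbers only: how an expected deceiver flattens the
# evidential force of "a seemingly correct argument for X".

def p_x_given_argument(prior_x, p_arg_given_x, p_arg_given_not_x):
    """Bayes' rule: P(X | arg) = P(arg|X)P(X) / [P(arg|X)P(X) + P(arg|~X)P(~X)]."""
    num = p_arg_given_x * prior_x
    den = num + p_arg_given_not_x * (1 - prior_x)
    return num / den

# Normal case: convincing arguments for falsehoods are rare.
normal = p_x_given_argument(0.5, 0.9, 0.1)    # -> 0.9

# Expected deception: a skilled deceiver produces a convincing
# argument either way, so the likelihoods match and nothing moves.
deceived = p_x_given_argument(0.5, 0.9, 0.9)  # -> 0.5, the prior

print(normal, deceived)
```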

I'm not sure it's useful to consider "what if they hack your mind" in this kind of conversation. Getting hacked isn't a Bayesian update, and hallucinations do not constitute evidence.

Comment author: ike 04 August 2014 10:43:26AM 0 points

If there was a way to differentiate hallucinations from real vision, then I'd agree, but there isn't.

Anyway, I thought of a (seemingly) knockdown argument for not believing future selves: what if you currently believe at 50% that tomorrow you'll be convinced of 2+2=3, the next day 2+2=5, and the next day 2+2=6? (And that 2+2 has only one answer.) If you blindly took each of those future credences as a present lower bound, your total probability mass over mutually exclusive answers would be at least 150%. Therefore, you can only trust your current self.
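A minimal check of the arithmetic in the argument above, using the numbers from the comment: three mutually exclusive claims about 2+2, each anticipated at 50%. Adopting each as a present lower bound yields total probability mass above 1, violating the probability axioms.

```python
# Three mutually exclusive answers to 2+2, each with an anticipated
# future credence of 50% (numbers from the comment above).
future_credences = {"2+2=3": 0.5, "2+2=5": 0.5, "2+2=6": 0.5}

# Taking each future credence as a present lower bound forces the
# total mass over mutually exclusive outcomes past 1 -- incoherent.
total = sum(future_credences.values())
print(total)  # 1.5
assert total > 1.0
```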

Comment author: Wes_W 04 August 2014 03:31:41PM 1 point

If there was a way to differentiate hallucinations from real vision, then I'd agree, but there isn't.

Sure, but that is a different problem from the one I'm talking about. Expecting to hallucinate is different from expecting to receive evidence. If you expect to be genuinely convinced, you ought to update now. If you expect to be "convinced" by a hallucination, I don't think any update is required.
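The claim that expecting to be genuinely convinced obliges you to update now is conservation of expected evidence: a coherent prior must equal the probability-weighted average of the possible posteriors. A sketch with illustrative numbers (assumed, not from the thread):

```python
# Conservation of expected evidence: prior = E[posterior], so you
# cannot coherently expect your credence to move in a known direction
# without moving it now. Numbers below are illustrative assumptions.
prior_x = 0.5
p_evidence = 0.4         # chance you actually see the convincing evidence
posterior_if_seen = 0.9  # credence in X if you do see it

# Coherence pins down the posterior in the no-evidence branch:
# prior = p_e * post_seen + (1 - p_e) * post_unseen
posterior_if_unseen = (prior_x - p_evidence * posterior_if_seen) / (1 - p_evidence)

expected_posterior = (p_evidence * posterior_if_seen
                      + (1 - p_evidence) * posterior_if_unseen)
print(round(posterior_if_unseen, 4))  # 0.2333
assert abs(expected_posterior - prior_x) < 1e-9
```

An expected hallucination, by contrast, carries no likelihood ratio, so it contributes nothing to this average and forces no update.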

Framing the 2+2=3 thing as being about deception is, IMO, failing to engage with the premise of the argument.

Anyway, I thought of a (seemingly) knockdown argument for not believing future selves: what if you currently believe at 50% that tomorrow you'll be convinced of 2+2=3, the next day 2+2=5, and the next day 2+2=6?

I would be very confused, and very worried about my ability to separate truth from untruth. In that state, I wouldn't feel very good about trusting my current self, either.