Alicorn comments on The ideas you're not ready to post - Less Wrong

24 Post author: JulianMorrison 19 April 2009 09:23PM


Comment author: Alicorn 19 April 2009 10:11:55PM 1 point

I have more to say about my cool ethics course on weird forms of utilitarianism, but unlike with Two-Tier Rationalism, I'm uncertain how germane the rest of these forms are to rationalism.

I have a lot to say about the Reflection Principle but I'm still in the process of hammering out my ideas regarding why it is terrible and no one should endorse it.

Comment author: SoullessAutomaton 19 April 2009 10:41:02PM 2 points

I have a lot to say about the Reflection Principle but I'm still in the process of hammering out my ideas regarding why it is terrible and no one should endorse it.

I'm not sure what Reflection Principle you're referring to here. Google suggests two different mathematical principles, but I'm not seeing how either of those would be relevant on LW, so perhaps you mean something else?

Comment author: Alicorn 20 April 2009 02:04:04AM 2 points

The Reflection Principle, held by some epistemologists to be a constraint on rationality, holds that if you learn that you will believe some proposition P in the future, you should believe P now. There is complicated math about what you should do if you have degree of credence X in the proposition that you will have credence Y in proposition P in the future and how that should affect your current probability for P, but that's the basic idea. An alternate formulation is that you should treat your future self as a general expert.
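The "complicated math" version can be sketched briefly (this is a hedged illustration, not from the comment itself; the numbers are hypothetical): in its generalized form, Reflection says your current credence in P should equal your expectation of your future credence in P.

```python
# Generalized Reflection: current credence in P equals the expectation
# of one's anticipated future credence in P.
# The distribution below is hypothetical, for illustration only.

# P(my future credence in P will be y), for a few values of y
future_credence_dist = {0.9: 0.3, 0.5: 0.7}

# Reflection-mandated current credence: E[future credence in P]
p_now = sum(y * prob for y, prob in future_credence_dist.items())
print(p_now)  # 0.9*0.3 + 0.5*0.7 = 0.62
```

So if you are 30% sure you will come to believe P at credence 0.9, and 70% sure you will hold it at credence 0.5, Reflection says your credence now should already be 0.62.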

Comment author: SoullessAutomaton 20 April 2009 09:26:27AM 1 point

Reminds me a bit of the LW (ab)use of Aumann's Agreement Theorem, heh--at least with a future self you've got a high likelihood of shared priors.

Anyway, I know arguments from practicality typically miss the point in philosophical arguments, but this seems especially useless--even granting the principle, under what circumstances could you become aware of your future beliefs with sufficient confidence to change your current beliefs accordingly?

It seems to boil down mostly to "If you're pretty sure you're going to change your mind, get it over with". Am I missing something here?

Comment author: Alicorn 20 April 2009 02:59:54PM 1 point

Well, that's one of my many issues with the principle - it's practically useless, except in situations that it has to be formulated specifically to avoid. For instance, if you plan to get drunk, you might know that you'll consider yourself a safe driver while you are (in the future) drunk, but that doesn't mean you should now consider your future, drunk self a safe driver. Sophisticated statements of Reflection explicitly avoid situations like this.

Comment author: JulianMorrison 20 April 2009 03:11:03AM -1 points

Well, that's pretty silly. You wouldn't treat your present self as a general expert.

Comment author: Alicorn 20 April 2009 03:01:07PM 2 points

Wouldn't you? You believe everything you believe. If you didn't consider yourself a general expert, why wouldn't you just follow around somebody clever and agree with them whenever they asserted something? And even then, you'd be trusting your expertise on who was clever.

Comment author: gwern 21 April 2009 03:15:24AM 0 points