
TheOtherDave comments on Making Beliefs Pay Rent (in Anticipated Experiences) - Less Wrong

110 Post author: Eliezer_Yudkowsky 28 July 2007 10:59PM



Comment author: TheOtherDave 04 February 2012 04:25:01AM 2 points

Excellent question!

Excellent, because it illustrates the problem with "believing in" the principle of falsifiability, as opposed to using it and understanding how it relates to the rest of my thinking.

Forget that the principle of falsifiability is itself incredibly important. What sorts of beliefs does the principle of falsifiability tell me to increase my confidence in? To decrease my confidence in?

What would the world have to be like for the former beliefs to be in general less likely than the latter?

Comment author: Ab3 04 February 2012 09:51:24PM 0 points

Thanks for the reply, Dave. Are you saying I should not look at falsifiability as a belief, but rather as a tool of some sort? That distinction sounds interesting but is not 100% clear to me. Perhaps someone should do a larger post about why the principle should not be applied to itself.

I have also thought of putting the problem this way: Eliezer states that the only ideas worth having are the ones we would be willing to give up. Is he willing to give up that idea? I don't think so, and I would be really interested to know why he doesn't believe this to be a contradiction.

Comment author: TheOtherDave 05 February 2012 01:55:24AM 2 points

What I'm saying is that the important thing is what I can do with my beliefs. If the "principle of falsifiability" does some valuable thing X, then in worlds where the PoF doesn't do X, I should be willing to discard it. If the PoF doesn't do any valuable thing X, then I should be willing to discard it in this world.

Comment author: Ab3 09 February 2012 06:53:00PM 0 points

It seems we have empirical and non-empirical beliefs that can both be rational, but what we mean by “rational” has a different sense in each case. We call empirical beliefs “rational” when we have good evidence for them; we call non-empirical beliefs like the PoF “rational” when we find that they have a high utility value, meaning there is a lot we can do with the principle (it excludes maps that can’t conform to any territory).

To answer my original question, it seems a consequence of this is that the PoF doesn’t apply to itself, as it is a principle that is meant for empirical beliefs only. Because the PoF is a different kind of belief from an empirical belief, it need not be falsifiable, only more useful than our current alternatives. What do you think about that?

Comment author: TheOtherDave 09 February 2012 11:28:16PM 1 point

I think it depends on what the PoF actually is.

If it can be restated as "I will on average be more effective at achieving my goals if I adopt only falsifiable beliefs," for example, then it is equivalent to an empirical belief (and is, incidentally, falsifiable).

If it can be restated as "I should only adopt falsifiable beliefs, whether doing so gets me anything I want or not" then there exists no empirical belief to which it is equivalent (and is, incidentally, worth discarding).