Tyrrell_McAllister comments on What Are Probabilities, Anyway? - Less Wrong

Post author: Wei_Dai, 11 December 2009 12:25AM




Comment author: Tyrrell_McAllister, 11 December 2009 03:03:29AM, 0 points

"All possible worlds are real, and probabilities represent how much I care about each world."

Could you elaborate on what it means to have a given amount of "care" about a world? For example, suppose that I assign (or ought to assign) probability 0.5 to a coin's coming up heads. How do you translate this probability assignment into language involving amounts of care for worlds?

Comment author: Zack_M_Davis, 11 December 2009 03:09:58AM, 7 points

You care equally for your selves that see heads and your selves that see tails. If you don't care what happens to you after you see heads, then you would assign probability one to tails. Of course, you'd be wrong in about half the worlds, but hey, no skin off your nose. You're the one who sees tails. Those other guys ... they don't matter.
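Zack's reading can be made concrete with a toy sketch (hypothetical, not from the thread): treat "probabilities" as caring weights in what is otherwise an ordinary expected-utility calculation. The function and variable names here are illustrative inventions.

```python
# Hypothetical sketch of "probabilities as caring weights":
# the agent's decision score weights each branch's utility by how
# much the agent cares about the selves in that branch, in the slot
# where probabilities usually go.

def decision_score(care_weights, utilities):
    """Normalized caring-weighted sum of branch utilities."""
    total = sum(care_weights.values())
    return sum(care_weights[w] * utilities[w] for w in utilities) / total

utilities = {"heads": 10.0, "tails": 0.0}

# Caring equally about heads-selves and tails-selves reproduces
# the usual p(heads) = 0.5 behavior.
even = {"heads": 1.0, "tails": 1.0}

# Caring nothing for the heads-selves makes the agent act exactly as
# if it had assigned probability one to tails.
tails_only = {"heads": 0.0, "tails": 1.0}

print(decision_score(even, utilities))        # behaves like p(heads) = 0.5
print(decision_score(tails_only, utilities))  # behaves like p(heads) = 0
```

On this picture, "assigning probability one to tails" and "not caring what happens to your heads-selves" are the same move: both zero out the heads branch in the agent's decision-making.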

Comment author: timtyler, 12 December 2009 09:10:56AM, 3 points

A bizarre interpretation.

For example, caring about "living until tomorrow" does not normally mean assigning a zero probability to death in the interim. If anything, that would tend to make you fearless (indifferent to whether or not you stepped in front of a bus), which is the very opposite of what we normally mean by "caring" about an outcome.

Comment author: Tyrrell_McAllister, 11 December 2009 04:46:58AM, 1 point

Thanks. That makes it a lot clearer.

It seems like this "caring" could be analyzed a lot more, though. For example, suppose I were an altruist who continued to care about the "heads" worlds even after I learned that I'm not in them. Wouldn't I still assign probability ~1 to the proposition that the coin came up tails in my own world? What does that probability assignment of ~1 mean in that case?

I suppose the idea is that a probability captures not only how much I care about a world, but also how much I think that I can influence that world by acting on my values.

Comment author: Wei_Dai, 11 December 2009 11:23:47PM, 0 points

See http://lesswrong.com/lw/15m/towards_a_new_decision_theory/ for more details. Many of my later posts can be read as explanations and justifications for the "design choices" I made in that post.