magfrump comments on Group rationality diary, 8/6/12 - Less Wrong Discussion

Post author: cata 08 August 2012 05:58AM

Comment author: magfrump 11 August 2012 05:32:16PM 0 points

Whence this "should"? That is my point.

I want a description of my expected future experiences. If that means the description contains random variables rather than forks in a road, that actually makes it better: the "fork in the road" metaphor is agenty, whereas the "random variable" metaphor is uncontrollable.

Comment author: Vladimir_Nesov 11 August 2012 06:07:04PM 1 point

I want a description of my expected future experiences

For what purpose? Decision-theoretically, what matters is consequences, not experiences.

Comment author: magfrump 14 August 2012 08:41:53PM 0 points

I can imagine purposes for which envisioning multiple different hypotheticals is useful for decision-making, so I will concede this point. It turns out I simply have different criteria for what lets me sleep at night than I thought I did.

Comment author: TheOtherDave 11 August 2012 09:55:34PM 0 points

Decision-theoretically, what matters is consequences, not experiences.

I'm confused by this distinction. Can you give me an example of an experience that is not a consequence and therefore doesn't matter decision-theoretically? Can you give me an example of a consequence that is not an experience and therefore matters decision-theoretically?

Comment author: Vladimir_Nesov 11 August 2012 10:41:05PM 3 points

For example, if you make a decision and then die, there will be consequences, but no future experiences. And while future experiences are part of consequences, they don't paint a balanced picture, since (predictable) things outside your experiences are going to happen as well. You can send $X to charity, and the expected consequences will predictably depend on the specific (moderate) value of X, but you won't expect differing future experiences depending on X.

Comment author: TheOtherDave 11 August 2012 11:18:36PM 1 point

Gotcha! Sure, that makes sense. Thanks.