katydee comments on Epistemic and Instrumental Tradeoffs - Less Wrong

Post author: katydee, 19 May 2013 07:49AM




Comment author: ThrustVectoring, 19 May 2013 12:45:35PM, 5 points

The article seems quite incomplete without even mentioning value of information. Instrumental and epistemic rationality have the same goals when the VOI of learning things is positive, and opposite goals when the total VOI is negative. Now, it may be hard to capture the VOI of, say, movie spoilers and truths that are bad for you, but the typical piece of information has positive VOI. In other words, most information merely lets you make better choices, rather than affecting your experience in a way you could predict in advance to be negative.

This is basically the entire rationale for going on an information diet. Not all truths are of equal value to you, so if you can deliberately seek out only the high-value truths, you're consistently better off.
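The VOI idea above can be made concrete with a toy decision problem. Everything in this sketch (the umbrella scenario, the probabilities, the utilities) is an illustrative assumption, not anything from the thread: VOI is simply the expected utility of deciding after learning the information, minus the expected utility of the best decision you could make without it.

```python
# Toy VOI calculation: how much is a perfect weather forecast worth
# before deciding whether to carry an umbrella? All numbers are made up.
P_RAIN = 0.3

# utility[action][state]
utility = {
    "umbrella":    {"rain": 0.0,   "dry": -1.0},  # mild inconvenience if dry
    "no_umbrella": {"rain": -10.0, "dry": 0.0},   # soaked if it rains
}

def expected_utility(action, p_rain):
    return p_rain * utility[action]["rain"] + (1 - p_rain) * utility[action]["dry"]

# Best achievable acting on the prior alone: pick one action up front.
eu_without_info = max(expected_utility(a, P_RAIN) for a in utility)

# With a perfect forecast, pick the best action in each state,
# then weight those best outcomes by how likely each state is.
eu_with_info = (P_RAIN * max(utility[a]["rain"] for a in utility)
                + (1 - P_RAIN) * max(utility[a]["dry"] for a in utility))

voi = eu_with_info - eu_without_info
print(voi)  # the forecast is worth 0.7 utils in this toy setup
```

Note that the forecast has positive VOI here only because it can change which action you take; information that could never alter your decision (a movie spoiler, say) has zero VOI on this account, and can be net negative once you count its direct effect on your experience.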

Comment author: katydee, 19 May 2013 06:11:51PM, 0 points

I agree that VoI, and the calculations that allow you to use it effectively, are very important. However, this post serves as a basic overview, and I don't think taking the time to explain VoI and how to calculate it would fit here.

If you think a post on VoI is necessary as a "sequel" to this one, feel free to write it-- I don't have time with my current queue of things to write-- but please link me if and when you do!

Comment author: Vaniver, 19 May 2013 08:40:55PM, 7 points

but please link me if and when you do!

I wrote one a while back.

Comment author: katydee, 19 May 2013 11:28:54PM, 0 points

Thanks for the link. I'm not sure that post says exactly what I would try to say about the topic, but it is certainly interesting and useful in its own right.

Comment author: ThrustVectoring, 20 May 2013 03:02:11AM, 4 points

I think taking the time to explain VoI and how to calculate it wouldn't fit here.

I disagree. VoI is essentially a formalized way to describe the instrumental value of figuring out how the world is (or is going to be). As such, it's a very good way to relate instrumental rationality to epistemic rationality.