roland comments on How to Measure Anything - Less Wrong

50 Post author: lukeprog 07 August 2013 04:05AM


Comment author: Eliezer_Yudkowsky 03 August 2013 01:44:05PM 20 points [-]

A measurement is an observation that quantitatively reduces uncertainty.

A measurement reduces expected uncertainty; some particular measurement results increase uncertainty. E.g. you start out assigning 90% probability that a binary variable landed heads, and then you see evidence with a likelihood ratio of 1:9 favoring tails, sending your posterior to 50-50. However, the expected entropy of your posterior distribution, evaluated in advance of seeing the evidence, is always lower than the entropy of your current distribution.
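This can be checked numerically. A minimal sketch of the coin example, assuming the evidence is a binary observation E with P(E|heads) = 0.1 and P(E|tails) = 0.9 (i.e. a 1:9 likelihood ratio favoring tails):

```python
from math import log2

def entropy(p):
    """Binary entropy in bits of the distribution (p, 1 - p)."""
    return 0.0 if p in (0.0, 1.0) else -(p * log2(p) + (1 - p) * log2(1 - p))

p_heads = 0.9                       # prior P(heads)
p_e_heads, p_e_tails = 0.1, 0.9     # likelihoods: 1:9 ratio favoring tails

p_e = p_heads * p_e_heads + (1 - p_heads) * p_e_tails       # P(E) = 0.18
post_e = p_heads * p_e_heads / p_e                          # P(heads|E) = 0.5
post_not_e = p_heads * (1 - p_e_heads) / (1 - p_e)          # P(heads|not E) ~ 0.988

prior_H = entropy(p_heads)                                  # ~ 0.469 bits
expected_H = p_e * entropy(post_e) + (1 - p_e) * entropy(post_not_e)  # ~ 0.258 bits

# Seeing E increases uncertainty (0.469 -> 1.0 bit), yet the
# expectation over both possible observations still decreases it.
assert entropy(post_e) > prior_H
assert expected_H < prior_H
```

The single unlucky observation E pushes entropy up to a full bit, but that outcome only occurs with probability 0.18; averaged over outcomes, uncertainty falls.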

Comment author: roland 30 August 2015 11:19:15AM *  0 points [-]

The technical term for this is conditional entropy.

The conditional entropy is always lower than the prior entropy unless the evidence is independent of your hypothesis, in which case the conditional entropy equals the prior entropy.
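The boundary case can be checked the same way. A sketch assuming evidence that is independent of the hypothesis (P(E|heads) = P(E|tails)), so the update is a no-op and the conditional entropy equals the prior entropy:

```python
from math import log2

def entropy(p):
    """Binary entropy in bits of the distribution (p, 1 - p)."""
    return 0.0 if p in (0.0, 1.0) else -(p * log2(p) + (1 - p) * log2(1 - p))

p_heads = 0.9
p_e_heads = p_e_tails = 0.5          # evidence independent of the hypothesis

p_e = p_heads * p_e_heads + (1 - p_heads) * p_e_tails
post_e = p_heads * p_e_heads / p_e                       # posterior == prior
post_not_e = p_heads * (1 - p_e_heads) / (1 - p_e)       # posterior == prior
cond_H = p_e * entropy(post_e) + (1 - p_e) * entropy(post_not_e)

assert abs(post_e - p_heads) < 1e-12                 # update is a no-op
assert abs(cond_H - entropy(p_heads)) < 1e-12        # H(X|E) == H(X)
```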