amcknight comments on Thoughts and problems with Eliezer's measure of optimization power - Less Wrong

Post author: Stuart_Armstrong 08 June 2012 09:44AM


Comment author: amcknight 08 June 2012 09:09:44PM 1 point

If OP were an entropy, then we'd simply do a weighted sum 1/2(OP(X4) + OP(X7)) = 1/2(1 + 3) = 2, and then add one extra bit of entropy to represent our (binary) uncertainty as to what state we were in, giving a total OP of 3.
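The combining rule quoted here matches the standard decomposition of the entropy of a mixture whose components have disjoint supports: the weighted sum of the component entropies plus the entropy of the mixing weights. A minimal sketch (the function name and treating the two OP values as entropies in bits are illustrative assumptions, not anything from the post):

```python
import math

def mixture_entropy(weights, component_entropies):
    """Entropy (in bits) of a mixture of distributions with disjoint
    supports: the weighted sum of component entropies plus the entropy
    of the mixing weights themselves."""
    # Entropy of the (here binary) uncertainty over which component we're in.
    weight_entropy = -sum(w * math.log2(w) for w in weights if w > 0)
    # Weighted sum of the component entropies, plus that extra term.
    return sum(w * h for w, h in zip(weights, component_entropies)) + weight_entropy

# The quoted example: OP(X4) = 1 bit, OP(X7) = 3 bits, equal weights.
print(mixture_entropy([0.5, 0.5], [1.0, 3.0]))  # -> 3.0
```

This reproduces the 1/2(1 + 3) + 1 = 3 calculation: the "+1" is exactly the entropy of a fair binary choice between the two states.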

I feel like you're doing something wrong here. You're mixing state distribution entropy with probability distribution entropy. If you introduce mixed states, shouldn't each mixed state be accounted for in the phase space that you calculate the entropy over?

Comment author: Stuart_Armstrong 10 June 2012 09:06:07AM 1 point

If you go down the "entropy is ignorance about the exact microstate" route, this makes perfect sense. And various people have made convincing-sounding arguments that this is the right way to see entropy, though I'm not an expert myself.

Comment author: amcknight 11 June 2012 06:51:47PM 0 points

I'm not an expert either. However, the OP function has nothing to do with ignorance or probabilities until you introduce them via the mixed states. It seems to me that this standard combining rule is only valid when you're combining probabilities.

Comment author: Stuart_Armstrong 12 June 2012 04:38:59PM 0 points

Hence OP is not an entropy.