Sniffnoy comments on Explaining information theoretic vs thermodynamic entropy? - Less Wrong

-3 Post author: wedrifid 04 November 2010 11:41PM


Comment author: Sniffnoy 05 November 2010 12:29:30AM *  5 points

> haven't got it backwards and a string of all '1's has nearly zero entropy while a perfectly random string is 'maximum entropy'.

Ugh. I realize you probably know what you are talking about, but a category error like this is probably not going to help you explain it...

Edit: Actually, I suppose that sort of thing is not really a problem if they're used to the convention where "a random X" means "a probability distribution over Xs"; but if you're having to introduce information entropy in the first place, that's probably not the case. The real problem is that the string of all 1s is a distractor: it will make people think the fact that it's all 1s is relevant, rather than just the fact that it's a fixed string.
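
To make the category distinction concrete: Shannon entropy is a function of a distribution, not of any individual string. A minimal sketch in Python (my illustration, not part of the original comment):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution given as a
    list of outcome probabilities. Outcomes with probability 0
    contribute nothing."""
    return sum(-p * log2(p) for p in probs if p > 0)

# "The string of all 1s", viewed as a distribution, is a point mass:
# a single outcome with probability 1, hence zero entropy.
print(shannon_entropy([1.0]))            # 0.0

# "A perfectly random bit" means the uniform distribution over {0, 1}.
print(shannon_entropy([0.5, 0.5]))       # 1.0 (bit)

# Entropy is additive over independent bits, so a uniformly random
# n-bit string has entropy n bits: the "maximum entropy" case.
n = 8
print(n * shannon_entropy([0.5, 0.5]))   # 8.0
```

Note that the all-1s string only enters the picture as the support of a degenerate distribution; any other fixed string would give the same zero.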

Edit once again: Oh, did you mean Kolmogorov complexity (which, unlike Shannon entropy, is a property of an individual string)? Then never mind. "Entropy" without qualification usually means Shannon entropy.
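
Under the Kolmogorov-complexity reading the original contrast does make sense. Kolmogorov complexity itself is uncomputable, but compressed length is a standard computable upper-bound stand-in, and it shows exactly the contrast described. A rough sketch (again mine, using zlib as the proxy):

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """zlib-compressed length in bytes: a crude, computable upper
    bound used here as a stand-in for Kolmogorov complexity, which
    is not computable."""
    return len(zlib.compress(data, level=9))

all_ones = b"1" * 1000           # highly regular: has a short description
random_bytes = os.urandom(1000)  # incompressible with high probability

print(compressed_size(all_ones))      # a handful of bytes
print(compressed_size(random_bytes))  # close to 1000 bytes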