cousin_it comments on Explaining information theoretic vs thermodynamic entropy? - Less Wrong

-3 Post author: wedrifid 04 November 2010 11:41PM


Comment author: cousin_it 05 November 2010 01:55:13PM *  4 points [-]

Sniffnoy's comment has it right. Information-theoretic (Shannon) entropy is very similar to thermodynamic entropy and they don't contradict each other as you seem to think. They don't talk about individual bit strings (aka microstates, aka messages), but rather probability distributions over them. See this wikipedia page for details. If you have in mind a different notion of entropy based on algorithms and Kolmogorov complexity, you'll have to justify its usefulness to your physicist friends yourself, and I'm afraid you won't find much success. I don't have much use for K-complexity myself, because you can't make actual calculations with it the way you can with Shannon entropy.
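The point that Shannon entropy attaches to a probability distribution over messages (not to any single bit string), and that you can actually calculate with it, can be sketched in Python (the helper function name is mine, not from the comment):

```python
import math

def shannon_entropy(dist):
    """Shannon entropy H(p) = -sum(p_i * log2(p_i)), in bits.

    `dist` is a probability distribution over microstates/messages;
    the entropy is a property of the distribution, not of any one
    individual outcome.
    """
    return -sum(p * math.log2(p) for p in dist if p > 0)

# A fair coin has exactly 1 bit of entropy:
print(shannon_entropy([0.5, 0.5]))      # 1.0

# A heavily biased coin has less:
print(shannon_entropy([0.9, 0.1]))      # ~0.469

# A uniform distribution over 8 microstates has log2(8) = 3 bits:
print(shannon_entropy([1 / 8] * 8))     # 3.0
```

This is the sense in which Shannon entropy supports concrete calculation, in contrast to Kolmogorov complexity, which is uncomputable in general.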

Comment author: wedrifid 06 November 2010 12:09:55PM 1 point [-]

Sniffnoy's comment has it right. Information-theoretic (Shannon) entropy is very similar to thermodynamic entropy and they don't contradict each other as you seem to think.

No, I don't think that they contradict each other, and my words didn't suggest that. I merely refrain from holding my associates in contempt for this particular folly, since it is less intuitive than other concepts of equivalent levels of complexity.

Comment author: cousin_it 06 November 2010 03:42:48PM 1 point [-]

Wait, what exactly is their folly? I'm confused.