passive_fist comments on [LINK] The Bayesian Second Law of Thermodynamics - Less Wrong Discussion

Post author: shminux 12 August 2015 04:52PM (8 points)

Comment author: leplen 14 August 2015 12:43:18PM 0 points

But if I know that all the gas molecules are in one half of the container, then I can slide in a piston for free, and as the gas expands to fill the container again I can extract useful work. It seems that if I know about this increase in order, it definitely constitutes a decrease in entropy.
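To put rough numbers on the scenario above: for an ideal gas known to occupy half a container, an isothermal expansion back to the full volume can yield work W = N·k_B·T·ln 2, and the gas's entropy rises by ΔS = N·k_B·ln 2 — exactly the entropy that the knowledge "all molecules are in one half" had removed. A minimal sketch (my own illustration, not from the thread; the function names and the one-mole example are assumptions):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def work_from_halved_gas(n_molecules, temperature):
    """Maximum isothermal work extractable when a gas known to occupy
    half a container expands to fill it: W = N * k_B * T * ln(2)."""
    return n_molecules * k_B * temperature * math.log(2)

def entropy_increase(n_molecules):
    """Entropy gained by the gas during that doubling of volume:
    Delta_S = N * k_B * ln(2)."""
    return n_molecules * k_B * math.log(2)

# One mole of gas at room temperature (illustrative values):
N_A = 6.02214076e23
print(work_from_halved_gas(N_A, 300.0))  # ~1729 J of extractable work
print(entropy_increase(N_A))             # ~5.76 J/K of entropy increase
```

So the "free" work is real, but it is paid for by the low-entropy starting knowledge, which is the point the reply below makes.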

Comment author: passive_fist 15 August 2015 04:59:59AM 2 points

If you know precisely when this increase in order will occur, then your knowledge of the system is necessarily very high, and the entropy you assign to it is necessarily very low (probably close to zero) to begin with.

Comment author: leplen 15 August 2015 12:28:23PM 0 points

I feel like this may be a semantics issue. To me, order implies information: saying that a system becomes more ordered implies that I somehow know about the increased order. Under that construction, disorder (i.e. the absence of detectable patterns) is a measure of ignorance, and disorder is then closely related to entropy. You may be preserving a distinction between the map and the territory (i.e. between the system and our knowledge of the system) that I'm neglecting. I'm not sure which framework is more useful/productive.

I think it's definitely an important distinction to be aware of either way.

Comment author: passive_fist 16 August 2015 02:22:24AM * 0 points

'Order' is not a well-defined concept: one person's order is another's chaos. Entropy, on the other hand, is a well-defined concept.

Even though entropy depends on the information you have about the system, that dependence is not subjective: any two observers with the same information about the system must assign it exactly the same entropy.

All of this might seem counter-intuitive at first, but it makes sense once you realize that Entropy(system) isn't well-defined, while Entropy(system, model) is precisely defined. The 'model' is what Bayesians would call the prior; it is always there, either implicitly or explicitly.
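The Entropy(system, model) point can be made concrete with Shannon entropy: the same physical system gets different entropies under different priors, yet the mapping from prior to entropy is fixed, so observers sharing a prior must agree. A sketch under my own illustrative priors (the four-state system and the specific probabilities are assumptions, not from the thread):

```python
import math

def shannon_entropy(prior):
    """Entropy in bits of a probability distribution over microstates.
    The distribution itself is the 'model' (the Bayesian prior)."""
    return -sum(p * math.log2(p) for p in prior if p > 0)

# Same 4-state system, two observers with different information:
uniform_prior  = [0.25, 0.25, 0.25, 0.25]   # observer who knows nothing
informed_prior = [0.97, 0.01, 0.01, 0.01]   # observer who has measured it

print(shannon_entropy(uniform_prior))   # 2.0 bits
print(shannon_entropy(informed_prior))  # ~0.24 bits
```

The entropy values differ because the models differ, but the formula is the same for everyone: given the prior, the entropy is fully determined.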