Comment author:pragmatist
15 August 2012 10:07:55PM
3 points

I'm a bit skeptical of your claim that entropy is dependent on your state of knowledge; it's not what they taught me in my Statistical Mechanics class, and it's not what my brief skim of Wikipedia indicates. Could you provide a citation or something similar?

Sure. See section 5.3 of James Sethna's excellent textbook for a basic discussion (free PDF version available here). A quote:

"The most general interpretation of entropy is as a measure of our ignorance about a system. The equilibrium state of a system maximizes the entropy because we have lost all information about the initial conditions except for the conserved quantities... This interpretation -- that entropy is not a property of the system, but of our knowledge about the system (represented by the ensemble of possibilities) -- cleanly resolves many otherwise confusing issues."

The Szilard engine is a nice illustration of how knowledge of a system can affect how much work is extractable from it. Here's a nice experimental demonstration of the same principle (see here for a summary). This is a good book-length treatment of the connection between entropy and knowledge of a system.
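The Szilard engine point can be made quantitative: one bit of information about which half of the box the particle occupies lets you extract at most k_B T ln 2 of work. A minimal sketch (the function name and the 300 K figure are illustrative, not from the thread):

```python
import math

# Boltzmann constant in joules per kelvin (exact SI value)
K_B = 1.380649e-23

def szilard_work(temperature_k, bits_of_knowledge=1.0):
    """Maximum work (joules) extractable by a Szilard engine given
    `bits_of_knowledge` bits about the particle's position:
    W = k_B * T * ln(2) per bit."""
    return bits_of_knowledge * K_B * temperature_k * math.log(2)

# Knowing which half of the box the particle is in (1 bit) at 300 K:
w = szilard_work(300.0)  # roughly 2.87e-21 J
```

Tiny as this is, it is nonzero, which is the point: what you *know* about the microstate changes what work you can extract.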

Let's say you start with some prior over possible initial microstates. You can then time-evolve each of these microstates separately; now you have a probability distribution over possible final microstates. You then take the entropy of this distribution.

Yes, but the prior over initial microstates is doing a lot of work here. For one, it is encoding the appropriate macroproperties. Adding a probability distribution over phase space in order to make the derivation work seems very different from saying that the Second Law is provable from the fundamental laws. If all you have are the fundamental laws and the initial microstate of the universe then you will not be able to derive the Second Law, because the same microscopic trajectory through phase space is compatible with entropy increase, entropy decrease or neither, depending on how you carve up phase space into macrostates.
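The claim that the same trajectory is compatible with entropy increase or decrease depending on the macrostate carving can be shown in a toy model. Below, one fixed trajectory through a discrete state space gets a rising Boltzmann entropy under one partition and a falling one under another (the specific states and partitions are invented purely for illustration):

```python
import math

trajectory = [0, 1, 2, 3]  # one fixed microscopic trajectory through states 0..7

def boltzmann_entropy(state, partition):
    """ln of the size of the macrostate (cell) containing `state`."""
    for cell in partition:
        if state in cell:
            return math.log(len(cell))
    raise ValueError("state not in any cell")

# Carving A singles out the initial state: entropy rises along the trajectory.
carving_a = [{0}, {1, 2, 3, 4, 5, 6, 7}]
# Carving B singles out the final state: the SAME trajectory lowers entropy.
carving_b = [{3}, {0, 1, 2, 4, 5, 6, 7}]

ent_a = [boltzmann_entropy(s, carving_a) for s in trajectory]  # 0 -> ln 7
ent_b = [boltzmann_entropy(s, carving_b) for s in trajectory]  # ln 7 -> 0
```

Nothing about the microscopic dynamics distinguishes the two carvings; the direction of entropy change is supplied by the choice of macrostates.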

EDITED TO ADD: Also, simply starting with a prior and evolving the distribution in accord with the laws will not work (even ignoring what I say in the next paragraph). The entropy of the probability distribution won't change if you follow that procedure, so you won't recover the Second Law asymmetry. This is a consequence of Liouville's theorem. In order to get entropy increase, you need a periodic coarse-graining of the distribution. Adding this ingredient makes your derivation even further from a pure reduction to the fundamental laws.
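The Liouville point can be checked in a toy discrete model (illustrative only: a bijection of a finite state space stands in for volume-preserving Hamiltonian flow). The Gibbs entropy of the distribution is unchanged by the exact evolution and rises only once you coarse-grain:

```python
import numpy as np

N = 16                         # toy discrete "phase space"
p = np.zeros(N)
p[:4] = 0.25                   # sharply known initial condition: low entropy

def gibbs_entropy(p):
    """Discrete Gibbs/Shannon entropy -sum(p ln p), with 0 ln 0 := 0."""
    q = p[p > 0]
    return float(-np.sum(q * np.log(q)))

# A bijection of the state space plays the role of Liouville (volume-
# preserving) dynamics: probabilities are relabeled, never mixed.
perm = (5 * np.arange(N)) % N  # bijective since gcd(5, 16) = 1

def evolve(p):
    return p[perm]

def coarse_grain(p, cell=4):
    """Smear probability uniformly within each block of `cell` adjacent states."""
    return np.repeat(p.reshape(-1, cell).mean(axis=1), cell)

h0 = gibbs_entropy(p)                              # ln 4
h_evolved = gibbs_entropy(evolve(p))               # still ln 4: Liouville
h_coarse = gibbs_entropy(coarse_grain(evolve(p)))  # ln 16: rises only after
                                                   # coarse-graining
```

Without the coarse-graining step there is no entropy increase at all, which is exactly why that step cannot be omitted from the derivation.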

In any case, it is not so clear that even the procedure you propose works. The main account of why the entropy was low in the early universe appeals to the entropy of the gravitational field as compensation for the high thermal entropy of the initial state. As yet, I haven't seen any rigorous demonstration of how to apply the standard tools of statistical physics to the gravitational field, such as constructing a phase space which incorporates gravitational degrees of freedom. Hawking and Page attempted to do something like this (I could find you the citation if you like, but I can't remember it off the top of my head), but they came up with weird results. (ETA: Here's the paper I was thinking of.) The natural invariant measure over state space turned out not to be normalizable in their model, which means that one could not define sensible probability distributions over it. So I'm not yet convinced that the techniques we apply so fruitfully to thermal systems can be applied to the universe as a whole.

Comment author:rocurley
16 August 2012 12:21:23AM
1 point

Also, simply starting with a prior and evolving the distribution in accord with the laws will not work (even ignoring what I say in the next paragraph). The entropy of the probability distribution won't change if you follow that procedure, so you won't recover the Second Law asymmetry. This is a consequence of Liouville's theorem. In order to get entropy increase, you need a periodic coarse-graining of the distribution. Adding this ingredient makes your derivation even further from a pure reduction to the fundamental laws.

Dang, you're right. I'm still not entirely convinced of your point in the original post, but I think I need to do some reading up in order to:

1. Understand the distinction in approaches to the Second Law that you're proposing, which I haven't sufficiently explored.

2. See if it seems plausible that this is a result of treating physics as rules instead of descriptions.

This has been an interesting thread; I hope to continue discussing this at some point in the not super-distant future (I'm going to be pretty busy over the next week or so).
