rocurley comments on Natural Laws Are Descriptions, not Rules - Less Wrong
There's a widely acknowledged problem involving the Second Law of Thermodynamics. The problem stems from the fact that all known fundamental laws of physics are invariant under time reversal (well, invariant under CPT, to be more accurate) while the Second Law (a non-fundamental law) is not. Now, why is the symmetry at the fundamental level regarded as being in tension with the asymmetry at the non-fundamental level? It is not true that solutions to symmetric equations must generically share those same symmetries. In fact, the opposite is true. It can be proved that generic solutions of systems of partial differential equations have fewer symmetries than the equations. So it's not like we should expect that a generic universe describable by time-reversal symmetric laws will also be time-reversal symmetric at every level of description. So what's the source of the worry then?
I think it comes from a commitment to nomic reductionism. The Second Law is, well, a law. But if you really believe that laws are rules, there is no room for autonomous laws at non-fundamental levels of description. The law-likeness, or "ruliness", of any such law must really stem from the fundamental laws. Otherwise you have overdetermination of physical behavior. Here's a rhetorical question taken from a paper on the problem: "What grounds the lawfulness of entropy increase, if not the underlying dynamical laws, the laws governing the world's fundamental physical ontology?" The question immediately reveals two assumptions associated with thinking of laws as rules: the lawfulness of a non-fundamental law must be "grounded" in something, and this grounding can only conceivably come from the fundamental laws.
So we get a number of attempts to explain the lawfulness of the Second Law by expanding the set of fundamental laws. Examples include Penrose's Weyl curvature hypothesis and Carroll and Chen's spontaneous eternal inflation model. These hypotheses are constructed specifically to account for lawful entropy increase. Now nobody thinks, "The lawfulness of quantum field theory needs grounding. Can I come up with an elaborate hypothesis whose express purpose is accounting for why it is lawful?" (EDIT: Bad example. See this comment) The lawfulness of fundamental laws is not seen as requiring grounding in the same way as non-fundamental laws. If you think of laws as descriptions rather than rules, this starts to look like an unjustified double standard. Why would macroscopic patterns require grounding in a way that microscopic patterns do not?
I can't fully convey my own take on the Second Law issue in a comment, but I can give a gist. The truth of the Second Law depends on the particular manner in which we partition phase space into macrostates. For the same microscopic trajectory through phase space, different partitions will deliver different conclusions about entropy. We could partition phase space so that entropy decreases monotonically (for some finite length of time), increases monotonically, or exhibits no monotonic trend. And this is true for any microscopic trajectory through any phase space. So the existence of some partition according to which the Second Law is true is no surprise. What does require explanation is why this is the natural partition. But which partition is natural is explained by our epistemic and causal capacities. The natural macrostates are the ones which group together microstates which said capacities cannot distinguish and separate microstates which they can. So what needs to be explained is why our capacities are structured so as to carve up phase space in a manner that leads to the Second Law. But this is partly a question about us, and it's the sort of question that invites an answer based on an observation selection effect -- something like "Agency is only possible if the system's capacities are structured so as to carve up its environment in this manner." My view is that the asymmetry of the Second Law is a consequence of an asymmetry in agency -- the temporal direction in which agents can form and read reliable records about a system's state must differ from the temporal direction in which an agent's action can alter a system's state. I could say a lot more here but I won't.
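The claim that the same microscopic trajectory can show entropy increase, decrease, or neither depending on the partition can be illustrated with a toy model. This is a hypothetical sketch, not anything from the discussion: a deterministic walk through 15 labeled microstates, with Boltzmann entropy taken as the log of the size of the occupied macrostate cell. The partition sizes and trajectory are invented for illustration.

```python
import math

# Toy model: a deterministic trajectory x_t = t through 15 microstates.
# Boltzmann entropy of the occupied macrostate: S = ln(size of its cell).
trajectory = list(range(15))

def make_partition(cell_sizes):
    """Map each microstate to the index of the cell containing it."""
    cell_of = {}
    state = 0
    for i, size in enumerate(cell_sizes):
        for _ in range(size):
            cell_of[state] = i
            state += 1
    return cell_of

def entropy_series(traj, cell_of, cell_sizes):
    return [math.log(cell_sizes[cell_of[x]]) for x in traj]

# Partition A: cells grow along the trajectory -> entropy never decreases.
sizes_a = [1, 2, 3, 4, 5]
s_a = entropy_series(trajectory, make_partition(sizes_a), sizes_a)
# Partition B: the same cells in reverse order -> entropy never increases.
sizes_b = [5, 4, 3, 2, 1]
s_b = entropy_series(trajectory, make_partition(sizes_b), sizes_b)

assert all(s2 >= s1 for s1, s2 in zip(s_a, s_a[1:]))  # monotone non-decreasing
assert all(s2 <= s1 for s1, s2 in zip(s_b, s_b[1:]))  # monotone non-increasing
```

The microscopic trajectory is identical in both cases; only the choice of macrostates differs, yet one partition yields a monotone entropy increase and the other a monotone decrease.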
The point is that this sort of explanation is very different from the kind that most physicists are pursuing. I'm not saying it's definitely the right tack to pursue, but it is weird to me that it basically hasn't been pursued at all. And I think the reason for that is that it isn't the kind of grounding that the prescriptive viewpoint leads one to demand. So implicit adherence to this viewpoint has in this case led to a promising line of inquiry being largely ignored.
First of all, thank you for your detailed reply.
I think this is near to the core of our disagreement. It seems self-evident that two true laws/descriptions cannot give different predictions about the same system; otherwise, they would not both be true. If two mathematical objects (as laws of physics tend to be) always yield the same results, it seems natural to try to prove their equivalence. For example, when I learned Lagrangian mechanics in physics class, we proved it equivalent to Newtonian mechanics.
So the question arises, "Why should the Second Law of Thermodynamics be proved in terms of more 'fundamental' laws, rather than the other way around?" (this, if I'm interpreting you correctly, is the double standard). This is simply because the domain in which the Second Law can make predictions is much smaller than that of more fundamental laws. The second law of thermodynamics is silent about what happens when I dribble a ball; Newton's laws are not. As such, one proves the Second Law in terms of non-thermodynamic laws. "Fundamentalness" seems to simply be a description of domain of applicability.
I'm not qualified to assess the validity of the Weyl curvature hypothesis or of the spontaneous eternal inflation model. However, I've always understood that the increase in entropy is simply caused by the boundary conditions of the universe, not any time-asymmetry of the laws of physics.
It's self-evident that two true laws/descriptions can't give contradictory predictions, but in the example I gave there is no contradiction involved. The laws at the fundamental level are invariant under time reversal, but this does not entail that a universe governed by those laws must be invariant under time reversal, so there's nothing contradictory about there being another law that is not time reversal invariant.
What do you mean by "yield the same results"? The Second Law makes predictions about the entropy of composite systems. The fundamental laws make predictions about quantum field configurations. These don't seem like yielding the same results. Of course, the results have to be consistent in some broad sense, but surely consistency does not imply equivalency. I think the intuitions you describe here are motivated by nomic reductionism, and they illustrate the difference between thinking of laws as rules and thinking of them as descriptions.
No. I don't take it for granted that either law can be reduced to the other one. It is not necessary that the salient patterns at a non-fundamental level of description are merely a consequence of salient patterns at a lower level of description.
Well, yes, if the Second Law holds, then the early universe must have had low entropy, but many physicists don't think this is a satisfactory explanation by itself. We could explain all kinds of things by appealing to special boundary conditions but usually we like our explanations to be based on regularities in nature. The Weyl curvature hypothesis and spontaneous eternal inflation are attempts to explain why the early universe had low entropy.
Incidentally, while there are many heuristic arguments that the early universe had a low entropy (such as appeal to its homogeneity), I have yet to see a mathematically rigorous argument. The fact is, we don't really know how to apply the standard tools of statistical mechanics to a system like the early universe.
The entropy of a system can be calculated from the quantum field configurations, so predictions about them are predictions about entropy. This entropy prediction must match that of the laws of thermodynamics, or the laws are inconsistent.
This is incorrect. Entropy is not only dependent upon the microscopic state of a system, it is also dependent upon our knowledge of that state. If you calculate the entropy based on an exact knowledge of the microscopic state, the entropy will be zero (at least for classical systems; quantum systems introduce complications), which is of course different from the entropy we would calculate based only on knowledge of the macroscopic state of the system. Entropy is not a property that can be simply reduced to fundamental properties in the manner you suggest.
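The point that exact knowledge of the microstate gives zero entropy, while merely macroscopic knowledge gives positive entropy, follows directly from the Gibbs/Shannon formula S = -Σ p ln p. A minimal sketch (the microstate count is an arbitrary illustrative number):

```python
import math

def gibbs_entropy(probs):
    """Gibbs/Shannon entropy S = -sum(p ln p), in units of k_B."""
    return -sum(p * math.log(p) for p in probs if p > 0)

omega = 1024  # number of microstates compatible with the macrostate (illustrative)

# Exact knowledge of the microstate: all probability on one state -> S = 0.
exact = [1.0] + [0.0] * (omega - 1)
# Only macroscopic knowledge: uniform over all compatible microstates -> S = ln(omega).
macro = [1.0 / omega] * omega

assert gibbs_entropy(exact) == 0.0
assert abs(gibbs_entropy(macro) - math.log(omega)) < 1e-9
```

The microscopic state of the system can be identical in both cases; the entropy differs because the probability distribution, which represents what we know about the system, differs.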
In any case, even if it were true that full knowledge of the microscopic state would allow us to calculate the entropy, it still wouldn't follow that knowledge of the microscopic laws would allow us to derive the Second Law. The laws only tell us how states evolve over time; they don't contain information about what the states actually are. So even if the properties of the states are reducible, this does not guarantee that the laws are reducible.
I'm a bit skeptical of your claim that entropy is dependent on your state of knowledge; it's not what they taught me in my Statistical Mechanics class, and it's not what my brief skim of Wikipedia indicates. Could you provide a citation or something similar?
Regardless, I'm not sure that matters. Let's say you start with some prior over possible initial microstates. You can then time-evolve each of these microstates separately; now you have a probability distribution over possible final microstates. You then take the entropy of this final distribution.
I agree that some knowledge of what the states actually are is built into the Second Law. A more careful claim would be that you can derive the Second Law from certain assumptions about initial conditions and from laws I would claim are more fundamental.
Sure. See section 5.3 of James Sethna's excellent textbook for a basic discussion (free PDF version available here). A quote:
"The most general interpretation of entropy is as a measure of our ignorance about a system. The equilibrium state of a system maximizes the entropy because we have lost all information about the initial conditions except for the conserved quantities... This interpretation -- that entropy is not a property of the system, but of our knowledge about the system (represented by the ensemble of possibilities) -- cleanly resolves many otherwise confusing issues."
The Szilard engine is a nice illustration of how knowledge of a system can impact how much work is extractable from a system. Here's a nice experimental demonstration of the same principle (see here for a summary). This is a good book-length treatment of the connection between entropy and knowledge of a system.
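The quantitative version of the Szilard-engine point is standard: one bit of information about which half of the box a molecule occupies allows at most k_B T ln 2 of work to be extracted isothermally. A quick back-of-the-envelope calculation (the temperature is an arbitrary illustrative choice):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (2019 SI exact value)
T = 300.0           # room temperature, K (illustrative)

# One bit of knowledge about which half of the box the molecule occupies
# bounds the extractable work at k_B * T * ln(2).
work_per_bit = k_B * T * math.log(2)

# At 300 K this is roughly 2.87e-21 J per bit.
assert 2.86e-21 < work_per_bit < 2.88e-21
```

The scale is tiny per bit, which is why the knowledge-dependence of extractable work only became experimentally accessible recently.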
Yes, but the prior over initial microstates is doing a lot of work here. For one, it is encoding the appropriate macroproperties. Adding a probability distribution over phase space in order to make the derivation work seems very different from saying that the Second Law is provable from the fundamental laws. If all you have are the fundamental laws and the initial microstate of the universe then you will not be able to derive the Second Law, because the same microscopic trajectory through phase space is compatible with entropy increase, entropy decrease or neither, depending on how you carve up phase space into macrostates.
EDITED TO ADD: Also, simply starting with a prior and evolving the distribution in accord with the laws will not work (even ignoring what I say in the next paragraph). The entropy of the probability distribution won't change if you follow that procedure, so you won't recover the Second Law asymmetry. This is a consequence of Liouville's theorem. In order to get entropy increase, you need a periodic coarse-graining of the distribution. Adding this ingredient makes your derivation even further from a pure reduction to the fundamental laws.
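The Liouville point has a clean discrete analogue: volume-preserving dynamics on a finite phase space is a permutation of microstates, and a permutation leaves the Shannon entropy of any distribution exactly unchanged. Entropy only grows once you coarse-grain. A sketch under those simplifying assumptions (the state count, cell size, and random seed are arbitrary):

```python
import math
import random

def shannon_entropy(p):
    return -sum(q * math.log(q) for q in p if q > 0)

n = 12
random.seed(0)

# Volume-preserving dynamics on a discrete phase space: a permutation of
# microstates (the discrete analogue of Liouville's theorem).
perm = list(range(n))
random.shuffle(perm)

# An arbitrary prior over microstates, normalized to sum to 1.
p = [random.random() for _ in range(n)]
total = sum(p)
p = [q / total for q in p]

# Evolve the distribution one step: entropy is exactly preserved.
evolved = [0.0] * n
for i, q in enumerate(p):
    evolved[perm[i]] = q
assert abs(shannon_entropy(evolved) - shannon_entropy(p)) < 1e-12

# Coarse-grain into cells of 3 microstates by replacing each probability
# with its cell average. This step, not the dynamics, is what makes the
# entropy grow (averaging can never decrease Shannon entropy).
coarse = []
for i in range(0, n, 3):
    avg = sum(evolved[i:i + 3]) / 3
    coarse.extend([avg] * 3)
assert shannon_entropy(coarse) >= shannon_entropy(evolved) - 1e-12
```

Iterating "evolve, then coarse-grain" produces a non-decreasing entropy sequence, which is exactly the extra ingredient the comment says a pure reduction to the fundamental laws does not supply.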
In any case, it is not so clear that even the procedure you propose works. The main account of why the entropy was low in the early universe appeals to the entropy of the gravitational field as compensation for the high thermal entropy of the initial state. As of yet, I haven't seen any rigorous demonstration of how to apply the standard tools of statistical physics to the gravitational field, such as constructing a phase space which incorporates gravitational degrees of freedom. Hawking and Page attempted to do something like this (I could find you the citation if you like, but I can't remember it off the top of my head), but they came up with weird results. (ETA: Here's the paper I was thinking of.) The natural invariant measure over state space turned out not to be normalizable in their model, which means that one could not define sensible probability distributions over it. So I'm not yet convinced that the techniques we apply so fruitfully when it comes to thermal systems can be applied to the universe as a whole.
Dang, you're right. I'm still not entirely convinced of your point in the original post, but I think I need to do some reading up first.
This has been an interesting thread; I hope to continue discussing this at some point in the not super-distant future (I'm going to be pretty busy over the next week or so).