If it's worth saying, but not worth its own post (even in Discussion), then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should be posted in Discussion, and not Main.
4. Open Threads should start on Monday, and end on Sunday.
I've been thinking about (and writing out my thoughts on) the real meaning of entropy in physics and how it relates to physical models. It should be obvious that entropy(physical system) isn't well-defined; only entropy(physical model, physical system) is. Here, 'physical model' might refer to something like the kinetic theory of gases, and 'physical system' to, say, some volume of gas or a cup of tea. Entropy is interesting from this perspective because it becomes connected to the subjectivist interpretation of probability. Does anyone have links to similar ideas or discussions?
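To make the model-dependence concrete, here is a rough sketch (the two-argument notation is mine, not standard): the Gibbs/Shannon entropy is a functional of the probability distribution a model assigns to the system's microstates, so two different models of the same cup of tea can assign it different entropies.

```latex
% Gibbs/Shannon entropy of a model M that assigns probability p_i(M)
% to microstate i of the physical system:
S(M, \text{system}) = -k_B \sum_i p_i(M) \ln p_i(M)
```

The system only enters through the set of microstates being summed over; everything else is a property of the distribution, i.e. of the model, or in Jaynes' reading, of the observer's state of knowledge. Plausibly, a coarser model (fewer macroscopic constraints) assigns a broader maximum-entropy distribution and hence a larger entropy to the very same system, which is where the connection to subjectivist probability comes in.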
Isn't all this just punning on definitions? If the particle velocities in a gas are Maxwell-Boltzmann distributed for some parameter T, we can say that the gas has "Maxwell-Boltzmann temperature T". Then there is a separate Jaynes-style definition about "temperature" in terms of the knowledge someone has about the gas. If all you know is that the velocities follow a certain distribution, then the two definitions coincide. But if you happen to know more about it, it is still the case that almost all interesting properties follow from the...
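For reference, a quick sketch of the two definitions being compared, in standard textbook form and assuming the usual ideal-gas / canonical-ensemble setting: the Maxwell-Boltzmann speed distribution with parameter T, and the Jaynes-style maximum-entropy distribution, in which T shows up as the Lagrange multiplier for the mean-energy constraint.

```latex
% Maxwell-Boltzmann speed distribution at temperature T:
f(v) = 4\pi \left( \frac{m}{2\pi k_B T} \right)^{3/2} v^2 \, e^{-m v^2 / 2 k_B T}

% Jaynes: maximize S = -\sum_i p_i \ln p_i subject to \sum_i p_i E_i = \langle E \rangle
% (and normalization), which yields the canonical distribution:
p_i = \frac{e^{-\beta E_i}}{Z}, \qquad \beta = \frac{1}{k_B T}
```

If the mean kinetic energy is all you know about the gas, maximizing entropy reproduces the Maxwell-Boltzmann form, which is the sense in which the two notions of temperature coincide.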