So here's the thing. Entropy in physics is defined as [...]
That is one definition. It is not the only viable way to define entropy. (As you clearly know.) The recent LW post on entropy that (unless I'm confused) gives the background for this discussion defines it differently, and explains the author's reasons for preferring that definition.
(I am, like you I take it, not convinced that the author's reasons are cogent enough to justify the claim that the probabilistic definition of entropy is the only right one and that the thermodynamic definition is wrong. If I have given a different impression, then I have screwed up and I'm sorry.)
"Log of #possibilities" doesn't have any probabilities in it, but only because it's a deliberate simplification, targetting the case where all the probabilities are roughly equal (which turns out not to be a bad approximation because there are theorems that say most states have roughly equal probability and you don't go far wrong by pretending those are the only ones and they're all equiprobable). The actual definition, of course, is the "- sum of p log p" one, which does have probabilities in it.
So, the central question at issue -- I think -- is whether it is an error to apply the "- sum of p log p" definition of entropy when the probabilities you're working with are of the Bayesian rather than the frequentist sort; that is, when rather than naively counting states and treating them all as equiprobable you adjust according to whatever knowledge you have about the system. Well, of course you can always (in principle) do the calculation; the questions are (1) is the quantity you compute in this way of any physical relevance? and (2) is it appropriate to call it "entropy"?
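As a toy illustration of that distinction (again entirely my own, with hypothetical numbers): start from naive state-counting, then condition on one bit of extra knowledge about the system, and the "- sum of p log p" quantity drops accordingly:

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: -sum(p * log2 p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Naive state-counting: 8 equal-energy microstates, all equiprobable.
naive = [1 / 8] * 8
print(entropy_bits(naive))    # 3.0 bits

# Bayesian update: suppose we somehow learn the state index is even.
updated = [1 / 4 if i % 2 == 0 else 0.0 for i in range(8)]
print(entropy_bits(updated))  # 2.0 bits -- the computed "entropy" drops
```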
Now, for sure your state of knowledge of a system doesn't affect the behaviour of a heat engine constructed without the benefit of that knowledge. If you want to predict its behaviour, then (this is a handwavy way of speaking, but I like it) the background knowledge you need to apply when computing probabilities is what's "known" by the engine. And of course you end up with ordinary thermodynamic entropy. (I am fairly sure no one who has been talking about entropy on LW recently would disagree.)
But suppose you know enough about the details of a system that the entropy calculated on the basis of your knowledge is appreciably different from the thermodynamic entropy; that is, you have extra information about which of its many similar-looking equal-energy states it's more likely to be in. Then (in principle, as always) you can construct an engine that extracts more energy from the system than you would expect from the usual thermodynamic calculations.
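To put a rough number on "more energy than you would expect", assuming the standard Szilard/Landauer correspondence of at most k_B T ln 2 of extra work per bit of extra information (an assumption I'm importing here, not something argued above):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def extra_work_bound(bits_of_knowledge, temperature_kelvin):
    # Szilard/Landauer bound: up to k_B * T * ln(2) joules per bit of
    # information about the system's microstate.
    return bits_of_knowledge * K_B * temperature_kelvin * math.log(2)

# Hypothetically knowing one bit per molecule about a mole of gas at 300 K:
print(extra_work_bound(6.022e23, 300.0))  # ~1.7e3 J
```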
Does this make this "Bayesian entropy" an interesting quantity and justify calling it entropy? I think so, even though in almost all real situations it's indistinguishable from the thermodynamic entropy. If you start out with only macroscopic information, then barring miracles you're not going to improve that situation.

But it seems to me that this notion of entropy may make for a simpler treatment of some non-equilibrium situations. Say you have a box with a partition in it, gas on one side and vacuum on the other. Now you remove the partition. You briefly have extra information about the state of what's in the box beyond what knowing the temperature, volume and pressure gives you, and indeed you can exploit that to extract energy even if once the gas settles down its temperature is the same as that of its environment.

I confess I haven't actually done the calculations to verify that the "Bayesian" approach actually leads to the right answers; if (as I expect) it does, or can be adjusted in a principled way so that it does, then this seems like a nice way of unifying the equilibrium case (where you talk about temperature and entropy) and the non-equilibrium case (where you have to do something more resembling mechanics to figure out what energy you can extract and how). And -- though here I may just be displaying my ignorance -- I don't see how you answer questions like "10 ms after the partition is removed, once the gas has started flowing into the previously empty space, but isn't uniformly spread out yet, what's the entropy of the system?" without something resembling the Bayesian approach, at least to the extent of not assuming all microstates are equally probable.
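For the partition example specifically, here's a quick sanity check for the idealised moment just after removal (my own numbers; the messy 10 ms-later case would need the full calculation). Your extra information is "every molecule is still in the left half", which puts the Bayesian entropy N k_B ln 2 below the eventual equilibrium value, and extracting that difference isothermally via a piston would yield about T ΔS of work:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_gap(n_molecules, volume_ratio):
    # Entropy deficit (J/K) of an ideal gas known to occupy 1/volume_ratio
    # of the box, relative to the equilibrated state: N * k_B * ln(ratio).
    return n_molecules * K_B * math.log(volume_ratio)

N = 6.022e23   # one mole of molecules
T = 300.0      # kelvin
dS = entropy_gap(N, 2.0)  # partition at the halfway point
print(dS)      # ~5.76 J/K below the equilibrium entropy
print(T * dS)  # ~1.7e3 J extractable by expanding against a piston instead
```

That matches the familiar N k_B T ln 2 for isothermal doubling, so at least at t = 0 the Bayesian bookkeeping agrees with the usual result.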
[EDITED to add: I see you've already commented on the "extracting energy from a thermodynamically hot thing whose microstate is known" thing, your answer being that the machine you do it with needs to be very cold and that explains how you get energy out. But I haven't understood why the machine has to be very cold. Isn't it, in fact, likely to have lots of bits moving very fast to match up somehow with the molecules it's exploiting? That would make it hot according to the thermodynamic definition of temperature. I suppose you might argue that it's really cold because its state is tightly controlled -- but that would be the exact same argument that you reject when it's applied to the hot thing the machine is exploiting its knowledge of.]
OK, this is in fact interesting. In an important sense you have already won, or I have learned something, whichever description you find less objectionable.
I still think that the real definition of entropy is as you originally said: the log of the number of allowable states, where "allowable" means "at the same total energy as the starting state". To the extent entropy is then used to calculate the dynamics of a system, this unambiguous definition will apply when the system moves smoothly and slowly from one thermal equilibrium to another, a...
If it's worth saying, but not worth its own post (even in Discussion), then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should be posted in Discussion, and not Main.
4. Open Threads should start on Monday, and end on Sunday.