
mwengler comments on Open thread, Jan. 19 - Jan. 25, 2015 - Less Wrong Discussion

3 Post author: Gondolinian 19 January 2015 12:04AM


Comment author: mwengler 20 January 2015 12:59:25PM 4 points [-]

There are approximations in figuring entropy and thermal statistics that may be wrong in very nearly immeasurable ways. The one that used to stick in my head was the calculation of the probability of all the gas in a volume showing up briefly in one-half the volume. Without doing math I figured it is actually much less than the classic calculated result, because the classic result assumes zero correlation between where any two molecules are, and once any kind of significant density difference exists between the two sides of the volume this will break.
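The classic calculation referred to here treats each molecule's side of the box as an independent coin flip, giving (1/2)^N for all N molecules landing in one half. A minimal sketch of that independence-based estimate (molecule counts are illustrative):

```python
import math

# Classic estimate: if each molecule's side of the box is independent,
# the chance that all N molecules are in the left half at once is (1/2)^N.
def log10_prob_all_in_half(n_molecules: float) -> float:
    """log10 of (1/2)^N under the zero-correlation assumption."""
    return -n_molecules * math.log10(2)

print(log10_prob_all_in_half(100))       # about -30: a 1-in-10^30 chance for just 100 molecules
print(log10_prob_all_in_half(6.022e23))  # about -1.8e23 for one mole
```

Whether correlations push the true number even lower is exactly the question raised above; the sketch only reproduces the textbook assumption.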

But entropy is still real in the sense that it is "out there." An entire civilization is powered (and cooled) by thermodynamic engines, engines which quite predictably provide useful functionality, in ways predictable in detail from calculations of entropy.

A glass of hot water burns your skin even if you know the water and the skin's precise characterization in parameter space before they come in contact. Fast moving (relative to the skin) molecules of water break the bonds of some bits of skin they come in contact with. On the micro scale it may look like a scene from the matrix with a lot of slow moving machine gun bullets. The details of the destruction may be quite beautiful and "feel" cold, but essentially thanks to the central limit theorem, a whole lot of what happens will be predictable in a quite useful, and quite unavoidable way without having to appeal to the detail.

I think the only sense in which you can extract energy from water with a specially built machine, one custom designed for the water's current position in parameter space, is that the machine itself is at zero, or at least low, temperature. And so the fact that useful energy can be extracted from the interaction of finite-temperature water and a cold machine is totally consistent with entropy being real: thermal differences can power machines. And they do; witness the cars, trucks, airplanes and electric grid that are essential to our economy. The good news is you can get all the energy you need without knowing the detailed parameter space of the hot water, which is helpful because you then don't have to redesign your cold machine every few microseconds as you bring it new hot water from which to extract the next bit of energy.

Entropy is as real as energy whether it feels that way or not, and that is why machines work even when left unattended by consciousnesses to perceive their entropy and its flows.

Comment author: passive_fist 20 January 2015 08:04:57PM *  1 point [-]

I think you're getting several things wrong here.

because the classic result assumes zero correlation between where any two molecules are, and once any kind of significant density difference exists between the two sides of the volume this will break.

The assumption of zero correlation is valid for ideal gases. It will not break if there is a density difference. We're talking about statistical correlation here.

Entropy is as real as energy whether it feels that way or not, and that is why machines work even when left unattended by consciousnesses to perceive their entropy and its flows.

"Entropy is in the mind" doesn't mean that you need consciousness for entropy to exist. All you need is a model of the world. Part of Jaynes' argument is that even though probabilities are subjective, entropy emerges as an objective value for a system (provided the model is given), since any rational Bayesian intelligence will arrive at the same value, given the same physical model and same information about the system.

Comment author: mwengler 20 January 2015 11:26:31PM 0 points [-]

because the classic result assumes zero correlation between where any two molecules are, and once any kind of significant density difference exists between the two sides of the volume this will break.

The assumption of zero correlation is valid for ideal gases. It will not break if there is a density difference. We're talking about statistical correlation here.

Statistical independence means the chance that a molecule is at a particular spot does not depend at all on where the other molecules are. Certainly if the molecules never hit each other and only bounce off the walls of the volume, this would be true: since the molecules don't interact with each other, their probability of being in one place or another is not changed by where you put the other molecules, as long as they don't interact.

But molecules in a gas do interact: they bounce off each other. Even an ideal gas. There is an average distance they travel before bouncing off another molecule, called the mean free path. A mean free path much smaller than the size of the volume is typical at STP.
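The mean free path at STP can be estimated from kinetic theory as lambda = kT / (sqrt(2) * pi * d^2 * p). A rough sketch, using an approximate molecular diameter for nitrogen (that diameter is an assumed literature value, not something from the thread):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def mean_free_path(temp_k: float, pressure_pa: float, diameter_m: float) -> float:
    """Kinetic-theory mean free path: lambda = kT / (sqrt(2) * pi * d^2 * p)."""
    return K_B * temp_k / (math.sqrt(2) * math.pi * diameter_m**2 * pressure_pa)

# Nitrogen at STP; molecular diameter ~3.7e-10 m is an approximate value.
lam = mean_free_path(273.15, 101325, 3.7e-10)
print(f"{lam:.2e} m")  # on the order of 1e-7 m, vastly smaller than any lab-scale volume
```

The result, tens of nanometers, is indeed many orders of magnitude smaller than a macroscopic container, as the comment says.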

Does this interaction break non-correlation? My intuition is that it does. But the thing I know for sure is that the only derivation I have ever seen for the probability that all the gas is in half the volume was done under the assumption of zero correlation, which we only know holds for zero interaction. Zero interaction is NOT an assumption required by the ideal gas models, and it is certainly not true of any real gas.

"Entropy is in the mind" doesn't mean that you need consciousness for entropy to exist. All you need is a model of the world.

This is as true for Entropy as it is for Energy. By this standard, Entropy and Energy are both in the mind, neither one is "realer" than the other.

Comment author: spxtr 21 January 2015 04:53:49AM 0 points [-]

Entropy is in the mind in exactly the same sense that probability is in the mind. See the relevant Sequence post if you don't know what that means.

The usual ideal gas model is that collisions are perfectly elastic, so even if you do factor in collisions they don't actually change anything. Interactions such as van der Waals have been factored in. The ideal gas approximation should be quite close to the actual value for gases like Helium.
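The claim that interaction corrections are tiny for a gas like Helium can be checked against the van der Waals equation of state. A sketch, using approximate tabulated van der Waals constants for Helium (the exact constants are assumptions here, taken to their usual SI magnitudes):

```python
R = 8.314  # gas constant, J/(mol K)

def p_ideal(n, T, V):
    """Ideal gas pressure: P = nRT/V."""
    return n * R * T / V

def p_vdw(n, T, V, a, b):
    """Van der Waals pressure: P = nRT/(V - nb) - a*(n/V)^2."""
    return n * R * T / (V - n * b) - a * (n / V) ** 2

# Approximate van der Waals constants for Helium, SI units.
a_he, b_he = 3.46e-3, 2.38e-5  # Pa m^6/mol^2 and m^3/mol
n, T, V = 1.0, 273.15, 0.0224   # one mole near STP

pi_, pv = p_ideal(n, T, V), p_vdw(n, T, V, a_he, b_he)
print(f"ideal: {pi_:.0f} Pa, vdW: {pv:.0f} Pa, relative diff: {abs(pv - pi_) / pi_:.1e}")
```

The relative correction comes out around a tenth of a percent, supporting "quite close" for Helium.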

Comment author: mwengler 21 January 2015 06:26:46AM 0 points [-]

See the relevant Sequence post if you don't know what that means.

Without a link! So I went to the sequences page in the wiki and the word entropy doesn't even appear on the page! Good job referring me there without a link.

Entropy is in the mind in exactly the same sense that probability is in the mind.

Okay... Is that the same sense in which Energy is in the mind? Considering that this seems to be my claim that you are responding to, AND there is no reasonable way to get to a sequence page that corresponds to your not-quite-on-topic-but-not-quite-orthogonal response, that would be awfully nice to know.

Are you agreeing with me and amplifying, or disagreeing with me and explaining?

Comment author: spxtr 21 January 2015 06:32:11AM 0 points [-]
Comment author: mwengler 21 January 2015 06:49:36AM *  1 point [-]

Thank you.

The thing that leaps out at me is that the rhetorical equation in that article between the sexiness of a woman being in the mind and the probability of two male children being in the mind is bogus.

I look at a woman and think she is sexy. If I assume the sexiness is in the woman, and that an alien creature would think she is sexy, or my wife would think she is sexy, because they would see the sexiness in her, then the article claims I have been guilty of the mind projection fallacy because the woman's sexiness is in my mind, not in the woman.

The article then proceeds to enumerate a few situations in which I am given incomplete information about reality and each different scenario corresponds to a different estimate that a person has two boy children.

BUT... it seems to me, and I would love to know if Eliezer himself would agree, that even an alien given the same partial information would, if it were rational and intelligent, reach the same conclusions about the probabilities involved! So... probability, even Bayesian probability based on uncertainty, is no more or less in my head than 1+1=2 is. 1+1=2 whether I am an Alien mind or a Human mind, unlike "that woman is sexy," which may only be true in heterosexual male, homosexual female, and bisexual human minds, but not Alien minds.
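The point that any rational agent given the same partial information gets the same numbers can be made concrete by brute enumeration of the classic two-children problem. A minimal sketch:

```python
from itertools import product

# All equally likely (elder, younger) combinations: BB, BG, GB, GG.
families = list(product("BG", repeat=2))

def prob_two_boys(condition):
    """P(both boys | condition) by enumerating equally likely families."""
    kept = [f for f in families if condition(f)]
    return sum(f == ("B", "B") for f in kept) / len(kept)

# "At least one child is a boy" vs. "the elder child is a boy":
p_at_least_one = prob_two_boys(lambda f: "B" in f)
p_elder_boy = prob_two_boys(lambda f: f[0] == "B")
print(p_at_least_one, p_elder_boy)  # 0.3333333333333333 0.5
```

Different information states yield different probabilities, but the mapping from information to probability is the same for any agent running this enumeration.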

But be that as it may, your comment still ignores the entire discussion, which is: is Entropy more or less "real" than Energy? The fact is that Aliens who had steam engines, internal combustion engines, gas turbines, and air conditioners would almost certainly have thermodynamics, understand entropy, and agree with Humans on the laws of thermodynamics and the trajectories of entropy in the various machines.

If Bayesian probability is in the mind, and Entropy is in the mind, then they are like 1+1=2 being in the mind, things which would be in the mind of anything which we considered rational or intelligent. They would NOT be like "sexiness."

Comment author: gjm 21 January 2015 03:09:00PM 1 point [-]

Probability depends on state of knowledge, which is a fact about your mind. Another agent with the same state of knowledge will assign the same probabilities. Another agent fully aware of your state of knowledge will be able to say what probabilities you should be assigning.

Sexiness depends on sexual preferences, which are a fact about your mind. Another agent with the same sexual preferences will assess sexiness the same way. Another agent fully aware of your sexual preferences will be able to say how sexy you will find someone.

I don't see that there's a big difference here. Except maybe for the fact that "states of knowledge", unlike "sexual preferences", can (in principle) be ranked: it's just plain better for your state of knowledge to be more accurate.

Comment author: mwengler 21 January 2015 09:53:42PM 0 points [-]

Well yes. Of course everything you can say about probability and sexiness you can say about Energy, Entropy, and Apple. That is, the estimate of the energy or entropy relationships in a particular machine or experimental scenario depends on the equations for energy and entropy, and on the measurements you make on the system to find the values that go into those equations. Any mind with the same information will reach the same conclusions about the Energy and Entropy that you would, assuming you are all doing it "right." Any intelligence desiring to transform heat-producing processes into mechanical or electrical energy will even discover the same relationships for calculating energy and entropy as any other intelligence, and will build similar machines, machines that would not be too hard for technologists from the other civilization to understand.

Even determining if something is an apple. Any set of intelligences that know the definitions of apples common among humans on earth will be able to look at various earth objects and determine which of them are apples, which are not, and which are borderline. (I'm imagining there must be some "crabapples" that are marginally edible that people would argue over whether to call apples or not, as well as a hybrid between an apple and a pear that some would call an apple and some wouldn't).

So "Apple" "Sexy" "Entropy" "Energy" and "Probability" are all EQUALLY in the mind of the intelligence dealing with them.

If you check, you will see this discussion started by suggesting that Energy was "realer" than Entropy. That Entropy was more like Probability and Sexiness, and thus, not as real, while Energy was somehow actually "out there" and therefore realer.

My contention is that all these terms are equally as much in the mind as in reality, that as you say any intelligence who knows the definitions will come up with the same conclusions about any given real situation, and that there is no distinction in "realness" between Energy and Entropy, no distinction between these and Apple, and indeed no distinction between any of these and "Bayesian Probability." That pointing out that features of the map are not features of the territory does NOT allow you to privilege some descriptive terms as being "really" part of the territory after all, even though they are words that can and should obviously be written down on the map.

If you are going to explicate further, please state whether you agree or disagree that some of these terms are realer than others, as this is how the thread started and open-ended explication is ambiguous.

Comment author: gjm 21 January 2015 10:43:05PM 1 point [-]

So "Apple" "Sexy" "Entropy" "Energy" and "Probability" are all EQUALLY in the mind of the intelligence dealing with them.

Anything at all is "in the mind" in the sense that different people might for whatever reason choose to define the words differently. Because this applies to everything, it's not terribly interesting and usually we don't bother to state it. "Apple" and "energy" are "in the mind" in this sense.

But (in principle) someone could give you a definition of "energy" that makes no reference to your opinions or feelings or health or anything else about you, and be confident that you or anyone else could use that definition to evaluate the "energy" of a wide variety of systems and all converge on the same answer as your knowledge and skill grows.

"Entropy" (in the "log of number of possibilities" sense) and "probability" are "in the mind" in another, stronger sense. A good, universally applicable definition of "probability" needs to take into account what the person whose probability it is already knows. Of course one can define "probability, given everything there is to know about mwengler's background information on such-and-such an occasion" and everyone will (in principle) agree about that, but it's an interesting figure primarily for mwengler on that occasion and not really for anyone else. (Unlike the situation for "energy".) And presumably it's true that for all (reasonable) agents, as their knowledge and skill grow, they will converge on the same probability-relative-to-that-knowledge for any given proposition -- but frequently that won't in any useful sense be "the probability that it's true", it'll be either 0 or 1 depending on whether the proposition turns out to be true or false. For propositions about the future (assuming that we fix when the probability is evaluated) it might end up being something neither 0 nor 1 for quantum-mechanical reasons, but that's a special case.

Similarly, entropy in the "log of number of possibilities" sense is meaningful only for an agent with given knowledge. (There is probably a reasonably respectable way of saying "relative to what one could find out by macroscopic observation, not examining the system too closely", and I think that's often what "entropy" is taken to mean, and that's fine. But that isn't quite the meaning that's being advocated for in this post.)
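The "log of number of possibilities" reading can be illustrated with a toy system: N particles, each in the left or right half of a box, where an agent's entropy is the log of the arrangements consistent with what that agent knows. A sketch (particle counts are illustrative):

```python
from math import comb, log2

def entropy_bits_unknown(n):
    """Log2 of the 2^n left/right arrangements when nothing is known."""
    return n

def entropy_bits_known_count(n, k):
    """Log2 of arrangements consistent with knowing exactly k of n particles
    are in the left half (a macroscopic observation)."""
    return log2(comb(n, k))

n = 100
print(entropy_bits_unknown(n))           # 100 bits: total ignorance
print(entropy_bits_known_count(n, 50))   # ~96.3 bits: a macroscopic count helps only a little
print(entropy_bits_known_count(n, 0))    # 0.0 bits: knowing all are in one half pins everything down
```

The system is the same in every case; only the agent's information, and hence the count of "possibilities", differs.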

Sexiness is "in the mind" in an even stronger sense, I suppose. But I think it's reasonable to say that on the scale from "energy" to "sexiness", probability is a fair fraction of the way towards "sexiness".

Comment author: mwengler 21 January 2015 06:33:32AM -1 points [-]

The usual ideal gas model is that collisions are perfectly elastic, so even if you do factor in collisions they don't actually change anything.

They don't change ANYTHING? Suppose I start with a gas of molecules all moving at the same speed but in different directions, and they have elastic collisions with the walls of the volume. If they do not collide with each other, only bouncing off the walls, they never "thermalize"; their speeds stay the same forever. But if they do bounce off each other, the velocity distribution does become thermalized by their collisions, even when those collisions are elastic. So collisions don't change ANYTHING? They change the distribution of velocities to a thermal one, which seems to me to be something.
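This thermalization argument can be illustrated with a toy Monte Carlo: equal-mass particles in 2D, all starting at the same speed, undergoing random elastic pair collisions (momentum and kinetic energy are conserved in each collision by construction). The spread of speeds grows from zero toward a thermal distribution. A rough sketch, not a physical simulation of actual trajectories:

```python
import math, random

random.seed(0)

N = 1000
# All particles start with identical speed 1 but random directions.
vel = []
for _ in range(N):
    th = random.uniform(0, 2 * math.pi)
    vel.append([math.cos(th), math.sin(th)])

def speed_std(v):
    """Standard deviation of particle speeds."""
    speeds = [math.hypot(x, y) for x, y in v]
    mean = sum(speeds) / len(speeds)
    return math.sqrt(sum((s - mean) ** 2 for s in speeds) / len(speeds))

before = speed_std(vel)  # essentially 0: every speed is exactly 1

for _ in range(20000):
    # Elastic collision of two equal masses: in the center-of-mass frame the
    # relative speed is preserved while its direction is randomized.
    i, j = random.sample(range(N), 2)
    v1, v2 = vel[i], vel[j]
    cmx, cmy = (v1[0] + v2[0]) / 2, (v1[1] + v2[1]) / 2
    rel = math.hypot(v1[0] - v2[0], v1[1] - v2[1])
    th = random.uniform(0, 2 * math.pi)  # random scattering angle
    hx, hy = math.cos(th), math.sin(th)
    vel[i] = [cmx + rel / 2 * hx, cmy + rel / 2 * hy]
    vel[j] = [cmx - rel / 2 * hx, cmy - rel / 2 * hy]

after = speed_std(vel)
print(before, after)  # ~0, then a substantial spread: the speeds have thermalized
```

Even though every collision is perfectly elastic, the speed distribution broadens, which is exactly the point being made above.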

The ideal gas approximation should be quite close to the actual value for gases like Helium.

So even if molecule positions stayed perfectly decorrelated in an ideal gas with collisions, which I do not think you can demonstrate (and appealing to an unlinked sequence does not count as a demonstration), you would still have to face the fact that an actual gas like Helium would be "quite close" to uncorrelated, which is another way of saying... correlated.

Comment author: Viliam_Bur 20 January 2015 02:44:37PM 0 points [-]

Both the "entropy is in the mind" and "entropy is real" explanations seem plausible to me (well, I am not a physicist, so anything may seem plausible), so now that I think about it... maybe the problem is that even if we were able to know a lot of stuff, we might still be limited in the ways we can use that knowledge. And knowledge you can't realistically use is as if you didn't have it at all.

So, in theory, there could be a microscopic demon able to travel between the molecules of boiling water without hitting any of them -- so from the demon's point of view, there is nothing hot about that water -- but the problem is that we cannot do this with real stuff; probably not even with nanomachines. Calculating the path for the nanomachine would be computationally too expensive, and it is probably too big to fit between the molecules anyway. So the fact is that a few molecules are going to hit that nanomachine, or any larger object, regardless.

Or perhaps we could avoid the whole paradox by saying: "Actually no, you cannot have the knowledge about all molecules of the boiling water. How specifically would you get it, and how specifically would you keep it up to date?"

Comment author: passive_fist 20 January 2015 08:07:21PM *  0 points [-]

This is pretty much it, and it's a really subtle detail that causes a lot of confusion. This is why the real problem with Maxwell's demon isn't how you obtain the information, it's how you store the information, as Landauer showed. To extract useful work you have to erase bits ('forget' knowledge) at some point. And this raises the entropy.
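Landauer's bound puts a number on this: erasing one bit at temperature T dissipates at least kT ln 2 of energy. A one-function sketch:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_cost(temp_k: float, n_bits: float) -> float:
    """Minimum energy (J) dissipated when erasing n_bits at temperature temp_k,
    per Landauer's principle: n * kT * ln(2)."""
    return n_bits * K_B * temp_k * math.log(2)

# Erasing one bit at room temperature:
print(landauer_cost(300, 1))  # roughly 3e-21 J
```

Tiny per bit, but it is what stops a Maxwell's demon with finite memory from beating the second law: the bookkeeping eventually has to be erased.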