by [anonymous]

[Edited continually.]

Abstract: This is a layman's description of entropy as understood in thermodynamics.  There is nothing revolutionary in here.  I'm not advancing a new understanding of entropy; everything here can also be found in a thermodynamics textbook.  The purpose of this article is not to contradict the scientific understanding of entropy, but to attempt to elaborate it in layman's terms.  Entropy is not "that thing which results in the heat death of the universe," which is the way some people, particularly science fiction authors and readers, tend to regard it.  Entropy, though it is an abstract description of underlying processes, is, understood as those processes, just as necessary to our existence as gravity.  That is the purpose of this article: to clear up that particular misunderstanding, which is so pervasive in some circles.

On TimS's suggestion, I'll frame this description around the basic understanding most people have about entropy: It's that law of thermodynamics which prevents perpetual motion machines from happening.  So we'll consider three universes - one with backwards entropy.  One with no entropy.  And finally the universe we do live in.

To start with, imagine, for a moment, that entropy were reversed; that the amount of entropy in the universe were constantly -decreasing-.

What would this imply for us?

Approximately the same thing that increasing entropy implies: the amount of work we can extract from the universe is finite, dictated not by increasing energy homogeneity, but by decreasing homogeneity.  In a universe in which entropy is reversed, rivers would flow uphill - planets would slowly disintegrate, in point of fact, into galactic clouds of gas.  We could extract work from this process, while it lasted, provided we could survive in such an environment to begin with.  (We've evolved to extract work in one direction.  Our cells would be no more capable of producing work in the other direction than a steam engine, run in reverse, would be capable of taking in carbon dioxide and splitting it back into carbon and oxygen.)

If all of this seems absurd, limit our consideration to a simple thing - Newtonian gravity.  Consider a closed system containing two particles, five meters apart, as our initial condition.  This is a higher potential energy state - a lower entropy state - than the final condition, in which those particles have collided, and the energy has been reexpressed as atomic vibrations, heat.  If entropy were reversed, those particles would never be permitted to collide; the laws of physics would forbid it, because it would increase the entropy, in violation of our reversed law of entropy.
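
To put that in textbook terms (a worked formula added purely as illustration, not part of the argument above): the energy available from letting the two particles fall together from separation r_0 to r_f comes from the Newtonian potential,

```latex
U(r) = -\frac{G m_1 m_2}{r}, \qquad
\Delta E = U(r_0) - U(r_f) = G m_1 m_2 \left( \frac{1}{r_f} - \frac{1}{r_0} \right) > 0
```

That ΔE is what ends up as heat after the collision; the "reversed entropy" rule would be forbidding exactly this one-way conversion of ordered potential energy into disordered thermal motion.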

 

In the current universe, the work we can extract from matter is not merely limited by entropy, it is in fact permitted by it; it is the irreversibility of thermodynamic processes which permits machines to work to begin with.  A steam engine in which steam is as likely to contribute its energy to deoxidizing carbon and forming it into coal as it is to push a turbine is one in which the turbine moves fitfully, unpredictably, and in no particular direction.  The arrow of entropy is not merely the limit on how much work can happen, it is also the mechanism by which work happens.  It does not, in fact, matter which direction it goes; as long as it is predictable, work could be extracted.  Rivers wouldn't flow without entropy.

Entropy is not a property of a system, but a property of forces; the law of entropy can be restated as "Forces do what forces do."  We measure entropy strictly in terms of things meaningful to us; it cannot be directly measured, because it doesn't exist.

Entropy is not an increase in homogeneity.  It's not an increase in the number of microstates.  These are products of forces.

And forces can work in opposing directions.  The number of -macrostates- is in constant decline; this is also a product of forces.  In macroscopic terms, heterogeneity, not homogeneity, is on the rise; consider that, according to Big Bang Theory, an early state of the universe was a nearly uniform cloud of gas.  Compare that to the macroscopic state of the universe today, a heterogeneous mess.  (It doesn't matter for this argument whether Big Bang Theory is true or not; if true, this behavior is exactly what physics would predict.)

Statistical mechanics doesn't contradict this, but frames it in terms of probability; when dealing with statistical distributions, i.e., a cloud of gas, it's a way of expressing mathematically what is happening on an individual basis to each of the atoms.  If you were capable of modeling each individual atom in the cloud of gas, you would arrive at the same conclusions.  Entropy isn't necessarily related to information, although it can be modeled that way very easily in statistical mechanics, because information about a statistical representation of an entropic process does vary in relation to the entropy.  (Which means that information-theoretic models can still model entropy for a statistical system.)

The mathematical models for statistical mechanics entropy and informational entropy are very similar, which has led some people, including myself, to an initial misapprehension that they were describing similar processes; one of my early understandings of entropy was that information about the universe was being encoded into the universe, and that quantum uncertainty had to increase elsewhere in order to accommodate this certainty.  I will provide an armchair logic proof of why this isn't necessary below.  First, as to why they are similar - this is because they are both modeling, and measuring, uncertainty about particular variables within the system.
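
For reference, the two formulas in question (standard textbook forms, included here only for comparison) are the Gibbs entropy of statistical mechanics and the Shannon entropy of information theory:

```latex
S = -k_B \sum_i p_i \ln p_i \qquad \text{(Gibbs entropy, over microstate probabilities } p_i \text{)}
\\
H = -\sum_i p_i \log_2 p_i \qquad \text{(Shannon entropy of a distribution, in bits)}
```

Apart from Boltzmann's constant and the base of the logarithm, the forms are identical; both measure uncertainty over a probability distribution, which is exactly the resemblance described above.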

Shannon Entropy - the measurement of entropy in information systems - is a measure of the information which can be encoded in a given variable, and thus the measure of the information which is lost when that variable is lost.  This ties into conventional entropy because of statistical mechanics, which is framed on the concept of a "microstate" - that is, a configuration of particles which can result in a "macrostate," which is an observable macroscopic state.  (A forest is still a forest if a tree is two inches further to the left; the forest is the macrostate, the state of each individual tree is a microstate.  From an airplane, the precise position of an individual tree doesn't matter; the macrostate is the same.  A particular macrostate is, loosely speaking, a collection of microstates such that you cannot, by casual observation, identify which exact microstate the system is in.)
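
A minimal sketch of the Shannon side of this, with made-up distributions purely for illustration:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: the average information gained by
    learning the value of a variable with this distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries one full bit per flip; a heavily biased coin carries
# much less, because its outcome is nearly predictable before you look.
print(shannon_entropy([0.5, 0.5]))    # 1.0
print(shannon_entropy([0.99, 0.01]))  # ~0.08
```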

A given macrostate of a cloud of gas can encode - provided you could read its microstate - a vast amount of information.  The exact amount is not important; "vast" is all that you need to know.  Statistical mechanics asserts that the most likely macrostate is the one which is described by the most possible microstates, or, to phrase this somewhat differently, the most likely state of matter is the one in which, if you could read and write to the microstate, the most information can be encoded.  This may sound like a remarkable claim up until you revisit the definition of a macrostate and one particular word - "possible."  (Note that statistical mechanics makes a justified assumption that all possible microstates are equally likely; Liouville's theorem proves this for the case that all possible microstates were at some point in the past equally likely.  Or, in other words, all microstates are equally likely, provided you have not actually read the microstate.)
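
Here's a toy illustration of "the most likely macrostate is the one described by the most possible microstates" - a made-up model, not anything from the text above: N distinguishable particles each sit in the left or right half of a box, and the macrostate is just "how many are on the left."

```python
from math import comb

N = 20  # particles, each independently in the left or right half of the box

# Each of the 2**N left/right assignments is one microstate.  The macrostate
# "k particles on the left" is realized by C(N, k) microstates, so with all
# microstates equally likely its probability is C(N, k) / 2**N.
total = 2 ** N
for k in (0, 5, 10, 15, 20):
    print(f"{k:2d} on the left: {comb(N, k):6d} microstates, "
          f"probability {comb(N, k) / total:.4f}")

# The even split (k = 10) has by far the most microstates, so it is by far the
# most likely macrostate - nothing pushes the particles toward it except counting.
```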

That "possible" is pretty important.  Absent quantum fluctuations, which throw a wrench into the layman description (I'm sticking to classical mechanics because I'm judging that introducing the necessary "degrees of freedom" to explain behavior there is too messy for a layman), all your particles in a closed system can't, in any microstate, simultaneously adopt a leftward velocity; this state would violate the conservation of momentum.  Entropy in statistical mechanics is a different way of -measuring- entropy, but that entropy must still be the product of the laws of physics operating on individual particles.  Between point A and a later point B in time for a closed system X, entropy will -never- be less in point B than in point A.  The "statistical" in "statistical mechanics" doesn't grant the possibility that anything can happen; none of the possible microstates include "lower (mechanical) entropy over time."  (Again, classical mechanics.  For those interested in the quantum version of all this, I'll have to refer you to the concepts "degrees of freedom" and then "gauge theory".  If somebody can come up with a layman's description of that, by all means.)

[Draft section; will continue pending further review.]

Earlier I promised you an armchair proof that entropy is not necessarily information.  The armchair proof that entropy is not necessarily related to a universal concept of information, or level of system quantum uncertainty, is relatively simple - entropy still increases in a modeled system with 100% information about what is going on within it and no quantum uncertainty.  This is not to say that entropy -doesn't- represent system information or total quantum uncertainty, only that these concepts aren't necessary to entropy.  It also is not to say that entropic calculations do or do not directly represent -internal- uncertainty; it certainly limits the -amount- of information which can be represented in a universe, consistent with the statistical mechanics interpretation of entropy.  An armchair proof that entropy bounds the internal information storage capacity of a system is also relatively trivial; the process of information binding requires work, and entropy limits the amount of work that can be done within a system.

Entropy is, broadly speaking, a statement of the irreversibility of forces.  In a closed solar system, gravity means that eventually, everything will be in a stable configuration; a single concentrated ball, or dead objects orbiting each other.  There are several stable configurations, but each is a local entropic maximum.  The irreversibility of entropy, in a final description, can also be treated as an expression of the fact that stability persists, and instability ends - and importantly that this is true at every scale.

It's not disorder.  It's not homogeneity.  It's not the number of states.  These expressions of entropy are expressions of particular forces.

23 comments
Shmi:

Too many crackpot red flags to count.

Is this original research? If so, may I recommend reading a physics textbook before attempting? If not, where are the references?

No. It's not research at all. It's a layman's description of entropy, the purpose of which is to avoid the relatively common mistake of misidentifying entropy as, for example, "disorder." There's nothing revolutionary in here, and a basic thermodynamics textbook will confirm the contents. The purpose is to reidentify entropy from "That niggling problem we need to overcome that leads to the heat death of the universe" to "The principle by which we're here to consider the problem of the heat death of the universe to begin with."

I've added a leading abstract to this effect.

Edited further for clarity.

Shmi:

Still a number of contentious statements with no references. Misuse of the term "proof". Unexplained acronyms, undefined terms...

Contentious? Like what? (Seriously. I was working to avoid contentious statements.)

The word "proof" is prefaced with a modifier which warns the reader it is not a formal proof and may contain faults. It's not misuse under those conditions.

I'll expand BBT out to big bang theory. What terms are undefined?

Shmi:

Examples of contentious statements:

  • In a universe in which entropy is reversed, rivers would flow uphill - planets would slowly disintegrate, in point of fact, into galactic clouds of gas.

  • Entropy is not a property of a system, but a property of forces; the law of entropy can be restated as "Forces do what forces do." We measure entropy strictly in terms of things meaningful to us; it cannot be directly measured, because it doesn't exist.

Undefined terms are everywhere, from forces, to micro/macro states to information.

A galactic cloud of gas contains less entropy than a planet. An object higher up a gravitational well has less entropy than one lower in it. Presuming the law of entropy is reversed, it would be impossible for an object to fall - that would be a violation of the reversed law of entropy. Absent a decrease in entropy somewhere else to compensate ("work"), it would be capable only of rising, and then only by random drift. Thus, rivers would flow uphill, and planets would slowly disintegrate. What's to contend?

Entropy -isn't- directly measured. It doesn't even exist as an inherent property of isolated matter; it exists only as a relative property when comparing two points in a system. I didn't get into degrees of freedom, but they figure into this; if we discover a new inherent property of matter that can be exploited for work, all previous entropic values go out the window. This also isn't contentious. Entropy is a product of calculation based on degrees of freedom.

Microstates and macrostates are used in their conventional thermodynamic sense; we're on the internet, a Google query will provide these definitions. Forces are used in their conventional physics sense.

Information is a messy one. I'll consider that one, but I consider it here only to address a particular misapprehension about entropy, so I'm not sure if that's important, since the context of that misapprehension means anybody possessed of it already knows what I'm talking about.

Entropy is, broadly speaking, a statement of the irreversibility of forces.

As long as you're sticking to classical physics, the fundamental laws are invariant under time reversal. So in what sense are forces "irreversible"? Take Newtonian gravitation, an example you use. If you take the time reverse of a world where massive particles interact according to Newton's law of gravitation, the resulting time-reversed world will also be one where massive particles interact according to Newton's law of gravitation. The fundamental forces do not pick out a direction of time, so how can they be the foundation for the entropic arrow of time?

In a closed solar system, gravity means that eventually, everything will be in a stable configuration; a single concentrated ball, or dead objects orbiting each other. There are several stable configurations, but each is a local entropic maximum.

This claim, read literally, is false. It contradicts Liouville's theorem. The theorem says that in a Hamiltonian system, there cannot be a compression of accessible phase space with time. This means that our system cannot have any attractors or limit cycles. You, on the other hand, are saying that a closed solar system has attractors (a single concentrated ball) and limit cycles (dead objects orbiting each other). Basically you are saying that points of phase space which used to be available to the system (points that do not correspond to a single ball or stable orbits) will eventually become unavailable. This entails that phase space is being compressed.
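
For reference, Liouville's theorem for a system with Hamiltonian evolution can be written as the statement that the phase-space density ρ is constant along trajectories:

```latex
\frac{d\rho}{dt} \;=\; \frac{\partial \rho}{\partial t}
+ \sum_i \left( \frac{\partial \rho}{\partial q_i}\,\dot{q}_i
+ \frac{\partial \rho}{\partial p_i}\,\dot{p}_i \right) \;=\; 0
```

Since the density of system points is conserved as they flow, an initially spread-out region of phase space cannot be funneled into a smaller attracting set, which is the contradiction being pointed out here.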

The connection you are trying to draw between entropy and forces is not standard statistical mechanics. In fact, unless I am misinterpreting what you are saying, it contradicts standard statistical mechanics.

  • The fundamental forces do not pick out a direction of time, so how can they be the foundation for the entropic arrow of time?

I'm assuming that the law of entropy is obeyed. That bit about "Entropy being the product of forces" is important.

  • This claim, read literally, is false. It contradicts Liouville's theorem. The theorem says that in a Hamiltonian system, there cannot be a compression of accessible phase space with time.

The universe is not a Hamiltonian system. We have gravity, we have heat. Moreover, the phase space is maintained in the microstates.

I'm assuming that the law of entropy is obeyed. That bit about "Entropy being the product of forces" is important.

Your statement of the law of entropy is "Forces do what forces do". But the forces are doing what they do in the time reversed scenario as well. The forces don't change when you reverse time. So if the law of entropy is that forces do what they do, then why doesn't it hold in the time-reversed scenario?

When you say that entropy is the product of forces, I presume you mean that entropy increase is the product of forces acting (if that's not what you mean, then I don't know how to interpret your claim). My point is that entropy increase can't simply be a product of forces acting, because the time reversed situation is one in which the exact same forces act, yet entropy decreases. So entropy increase can't be derived from the force laws. But if the law of entropy is an independent law above and beyond the force laws, then in what sense is entropy the product of forces?

Perhaps all you mean is that there is no separate pressure that drives systems towards higher entropy states; that the only forces acting on systems are the fundamental forces, and the increase of entropy is a consequence of those forces acting. In this attenuated sense, it is true that entropy increase is the product of the forces acting, just as it is true that natural selection is the product of the forces acting or that business cycles are the product of the forces acting. If this is your point, it is getting occluded by your presentation. Saying things like "expressions of entropy are expressions of particular forces" suggests that you're making a different (and false) point.

The universe is not a Hamiltonian system.

Yes, but in the bit I quoted, you were talking about a closed solar system, not the universe. A closed solar system governed by Newtonian gravitation is a Hamiltonian system.

We have gravity, we have heat.

Gravitational systems are Hamiltonian. It is true that if a system dissipates heat it will not be Hamiltonian, but a closed solar system will not dissipate heat by definition, unless you mean something different by "closed" than standard physics usage.

Moreover, the phase space is maintained in the microstates.

Not sure why this is relevant. There is a proper subset of phase space that corresponds to equilibrium macrostates. You are suggesting that no matter where in phase space the system begins, it will eventually end up somewhere in this subset and will remain within this subset. This contradicts Liouville's theorem.

I've deleted the post, so it doesn't matter too much, but I'll respond anyways:

This interpretation:

"Perhaps all you mean is that there is no separate pressure that drives systems towards higher entropy states; that the only forces acting on systems are the fundamental forces, and the increase of entropy is a consequence of those forces acting. In this attenuated sense, it is true that entropy increase is the product of the forces acting, just as it is true that natural selection is the product of the forces acting or that business cycles are the product of the forces acting. If this is your point, it is getting occluded by your presentation. Saying things like "expressions of entropy are expressions of particular forces" suggests that you're making a different (and false) point."

is correct, however I'll disagree about what my point suggests, as I regard my construction as semantically identical to "entropy increase is the product of the forces acting". There might be jargon I'm misusing, though, as I'm prone to that, so I have to concede there might be an implication I'm missing.

The point of my post was effectively that the law of entropy -isn't- an independent law; my elaborate and failed constructions were attempts to demonstrate this by reversing the law and showing that all the other laws of physics would necessarily stop working.

For the second point, I'll just repeat that I wasn't discussing a Hamiltonian system. A closed solar system isn't Hamiltonian unless you treat planets and the sun as point-masses, which I wasn't doing; gravitational tides in the particulate composition of planets slowly rob the system of orbital energy and convert it to heat. The whole thing eventually collapses on itself. This is the kind of situation I was trying to discuss. I guess I failed on that count as well. Shrug

As long as you're sticking to classical physics, the fundamental laws are invariant under time reversal. So in what sense are forces "irreversible"? Take Newtonian gravitation, an example you use. If you take the time reverse of a world where massive particles interact according to Newton's law of gravitation, the resulting time-reversed world will also be one where massive particles interact according to Newton's law of gravitation. The fundamental forces do not pick out a direction of time, so how can they be the foundation for the entropic arrow of time?

This is probably not what OrphanWilde meant with his statement, but I agree with you that it does seem incorrect as stated: (classical) forces are indeed symmetric in time. As it's written, it looks like the increase in entropy is caused by the various forces defining an arrow of time. Instead they're more like two sides of the same coin: we see the forces operating in one time direction and we see an increase in entropy in the same direction. The same laws for the forces, with a reversed arrow of time, would cause a decrease in entropy. So, once you pick a preferred direction for the arrow of time, the laws for the forces dictate how entropy should evolve, but they don't choose the direction of the arrow for themselves.
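
To make the time-reversal point concrete (a standard observation, added only as illustration): Newton's second law with a position-dependent force is unchanged under t → -t, because the second time derivative absorbs the sign flip:

```latex
m \frac{d^2 x}{dt^2} = F(x)
\qquad\longrightarrow\qquad
m \frac{d^2 x}{d(-t)^2} = m \frac{d^2 x}{dt^2} = F(x)
```

Velocities reverse, but the same trajectories run backwards are also solutions; nothing in the force law itself picks out a direction of time.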

This is a layman's description of entropy as understood in thermodynamics.

When a layman talks about entropy, he is usually just talking out of his ass. Entropy is a mathematical function that one might apply to physical situations. I don't see any equations here.

You could look into Jaynes and his work on maximum entropy thermodynamics. I think it's a minority viewpoint, but to paraphrase one of my favorite Nietzsche quotes - only the day after tomorrow belongs to Jaynes. That should at least familiarize you with the equations and the physics, and what one capable guy thought about it all. I don't know how much of it I've grokked. I'd like to have a better grasp, but I'm pretty confident that you're just grasping at air. Define your systems. Define how you're modeling them. Then write some equations. That's the beginning of making sense.

TimS:

Not a physicist, but here's a question that might help clarify your point.

What is the difference between a world without increasing entropy and a world in which perpetual motion machines are possible?

Hm. That's a very subtle question. And probably an important one to answer.

I think I'll try to reframe my post around that. Thank you.

[anonymous]:

It might be useful if you could talk about how entropy is used in different contexts. For instance, in information theory there is a rigorous, intuitive definition (Shannon entropy) that has nothing to do with "irreversibility of forces". The connections between information-theoretic and statistical physics conceptions of entropy might help people understand which usage/intuition is relevant.

Is the current revision useful in that regard, before I continue expanding on it? (It seems to me convoluted.)

I despise information theorists for adding -that- bit of confusion to the matter. (It was particularly irritating in my last year of college, when I was taking information theory and thermodynamics at the same time.) You have a point. Groan. Back to editing.

I discovered this website searching for the meaning of Entropy. As a lay(wo)man, I became learned of the phenomenon from a doco detailing the Universe, which best described the state of my apartment; it too has a black hole, which consumed my new out-of-power mobile phone. If I knew where the black hole was, I'd read its surface to gain back all my contact details. That being said, I am intrigued to no end by how many words and streams of scientific thought & reasoning surround the word "Entropy"... truly fascinating stuff, that.

Here's a YouTube video of Roger Penrose explaining entropy in terms of phase space, in 9 minutes. Time well spent!

A decent explanation of phase space, and utterly worthless for explaining why entropy is not a bad thing.

In other words, a universe where the arrow of time runs backwards?

Not quite what I was intending, but yes. The purpose of considering entropy in reverse is to redefine entropy in the minds of people here from a problem to "One of those necessary preconditions of our existence." I started off by suggesting a consideration of what reversing entropy would look like to reveal the issue inherent in it - that it merely leads to a -different- inevitable doom.