Naked Science Forum
General Science => General Science => Topic started by: Damiangage on 01/07/2020 09:54:00

This might be an ignorant question, but I'm struggling to fully understand the second law of thermodynamics. People often talk in real-world terms: a broken glass won't spontaneously come back together; if an irreversible process occurs, entropy increases, etc. But we know the universe is expanding, it's getting colder, and eventually the usable energy in the universe will run out? Every star will die, be reborn, die, be reborn, but eventually they will all die; is that not correct? Galaxies increase their distance from, and reduce their effect on, each other. Eventually the universe will be a cold, dark place. That sounds like the simplest, lowest form of disorder imaginable, and that is in our future. Isn't that an example of our universe decreasing in entropy as time moves?
Perhaps we don't know for sure that is the universe's fate, and I'm speculating on speculation? Furthermore, doesn't having the option of reversibility increase disorder rather than decrease it? Doesn't having more options increase the value of that variable? Sorry if this is a stupid question =D Help me understand!

Hi Damiangage; welcome.
This might be an ignorant question
It certainly isn’t.
There are a few points I would like to pick up, but I’m a bit pushed at present. In any case, there are others more qualified than I, who will probably “pick up the conch”.
For now, just remember John von Neumann’s advice to Claude Shannon, who was looking for a title for an idea he was developing: “You should call it entropy…[because] no one knows what entropy really is, so in a debate you will always have the advantage.”

To see if we can kickstart this, I’m posting a brief outline of my understanding of the background of entropy. There should be enough errors in that to stimulate a response. :)
One problem with entropy is that its definition and usage have changed over time. The definition of entropy, as originally conceived in classical thermodynamics, had nothing to do with order or disorder. It had everything to do with how much heat energy was stored or trapped in a body at a given temperature.
An early equation was: S=Q/T, where S is entropy; Q is the heat content; and T is the temperature of the body.
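To give that classical formula a concrete feel, here is a minimal Python sketch of my own (not from any post in this thread) computing the entropy change when one kilogram of ice melts reversibly at its melting point. The latent heat figure is an approximate textbook value.

```python
# Classical (Clausius) entropy change for a reversible process
# at constant temperature: delta_S = Q / T

Q = 334_000.0  # J, approximate latent heat to melt 1 kg of ice
T = 273.15     # K, melting point of ice

delta_S = Q / T
print(f"{delta_S:.1f} J/K")  # roughly 1223 J/K
```

So melting a kilogram of ice "traps" about 1.2 kJ/K of entropy in the water, even though the temperature never changes.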
Close on the heels of the Industrial Revolution, physics was experiencing its own profound upheavals. With a growing belief in atoms as physical realities that could no longer be ignored, entropy took on a different significance. With the realisation, for example, that a gas was composed of a collection of molecules, that these molecules were constantly in motion, and that the pressure of a gas in a container was a measure of the average energy of the molecules as they collided with one another, or hit the sides of that container, people began to look at entropy in different ways. Ludwig Boltzmann used the infant techniques of statistical analysis to define entropy in terms of atoms and their restless interactions. He distinguished between macroscopic states, those we could observe, and microscopic states, which could, at the time, be investigated only indirectly, and he used entropy to make the link between them. Boltzmann reasoned that entropy could be defined in terms of the number of microscopic arrangements of atoms that appear indistinguishable from a macroscopic perspective. It was also reasoned that the more ways a system could move internally, the more molecular kinetic energy that system could hold for a given temperature. Temperature was, thus, defined as the average kinetic energy per mode of movement.
We can consider that each mode can hold a specific amount of kinetic energy, so the more modes a system has, the more total kinetic energy that system contains. It follows from this that the greater the number of kinetic energy modes, the greater is the entropy of the system. Thus, at atomic level, entropy is just a measure of the total number of molecular kinetic energy modes in a given system.
Boltzmann had formalised a good working definition of entropy, but he was not the sort of person who would leave things as vague as that, so he set about giving this idea a sound mathematical basis.
Boltzmann assigned the letter “S” to entropy and “W” to the number of microstates. He then showed that “S” had a numerical value equal to the natural logarithm of “W”, multiplied by “k”, which became known as Boltzmann’s constant, and which = 1.38 x 10^{-23} joules per kelvin. Thus, the deceptively simple-looking equation was: S = k log W.
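For anyone who likes to see numbers, here is a tiny Python sketch of the Boltzmann formula, S = k ln W (the “log” on the tombstone is the natural logarithm). The microstate count used below is an arbitrary made-up figure, purely for illustration.

```python
import math

# Boltzmann's constant in joules per kelvin
k_B = 1.380649e-23

def boltzmann_entropy(W: float) -> float:
    """Entropy S = k ln W for a system with W microstates."""
    return k_B * math.log(W)

# One microstate: only one way to arrange things, so zero entropy
print(boltzmann_entropy(1))      # 0.0

# A toy system with 10^20 microstates (illustrative number only)
print(boltzmann_entropy(1e20))
```

Note how brutally the logarithm compresses things: even 10^20 microstates yields an entropy of only around 6 x 10^-22 J/K, which is why macroscopic entropies correspond to astronomically large microstate counts.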
All too often, entropy is presented as being a measure of disorder. Possibly, Boltzmann is to blame for this, as it was he who introduced the analogy in an attempt to “sell” his revolutionary ideas to fellow scientists, many of whom were still kicking against atomic theory. As far as I am aware, he never said that entropy was a measure of disorder. However, this is how I first met it, in the early 1980s, on an Open University science foundation course, and this is how I understood it for the next 20 years, or so. Experts often present it in this way, but that is a mixed blessing.

https://alevelchemistry.co.uk/notes/ratesequilibriumandph/#:~:text=Entropy,Entropy%20is%20a%20measure%20of%20the%20disorder%20of%20a%20system,of%20maximum%20chaos%20or%20disorder.
“Entropy is a measure of the disorder of a system. The law of disorder states that things move spontaneously in the direction of maximum chaos or disorder.”
Er – yes; or should that be no?

I hope this video helps.

Like the video – with reservations. Clearly, some of the points link to #2. Unfortunately, he stops short of a clear definition of entropy in terms of “the number of microscopic arrangements of atoms that appear indistinguishable from a macroscopic perspective”, but it’s “better than the average bear”.

The Second Law of Thermodynamics states that the entropy of the entire universe, treated as an isolated system, will always increase over time. The second law also states that the total change in the entropy of the universe can never be negative.

Hi π5209806, welcome to TNS.
The Start of Time?
The second law of thermodynamics dictates that in a closed system, entropy tends to increase, or to remain constant. In spite of the presence of the word “tends”, the assertion “it never decreases” can still be found attached to many explanations. Clearly, this should give us pause for thought.
Unfortunately, I’ve misplaced the source of the quote “entropy tends to increase”, and don’t have time to search for it now. However, as stressed elsewhere, the law relating to the increase of entropy in a closed system is a statistical law. A spontaneous decrease is not physically impossible; it is just extremely unlikely.
Sean Carroll points out that:
“The chances of any low entropy state evolving into a high entropy state is the same as that of any high entropy state evolving into a low entropy state.”

A few minutes available to add a bit to Carroll's quote.
He appears to say that entropy can, and perhaps should, evolve in either direction, with equal probability. So why does entropy increase? The simple answer is that there are vastly more potential high entropy states than low entropy states towards which evolution can occur. Therefore, there is a much greater chance that any low entropy state will evolve into a higher entropy state. It's just statistics: what could (physically) happen is not observed, because the age of the Universe is not long enough for its occurrence to be remotely likely.
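The counting argument is easy to check with a toy model of my own devising (not from Carroll): take 100 coins, call the number of heads the “macrostate”, and a particular arrangement of individual coins a “microstate”. Python's `math.comb` counts the arrangements:

```python
import math

N = 100  # number of coins

# Microstates for the "all tails" macrostate: exactly one arrangement
all_tails = math.comb(N, 0)

# Microstates for the "half heads" macrostate
half_heads = math.comb(N, N // 2)

print(all_tails)   # 1
print(half_heads)  # about 1.0 x 10^29
```

A random shuffle of the coins is overwhelmingly likely to land near the half-heads macrostate, simply because roughly 10^29 arrangements look like it and only one looks like “all tails”. Scale the coins up to ~10^23 molecules and “unlikely” becomes “never observed in the age of the Universe”.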

Here are some other videos trying to explain entropy.