Naked Science Forum
Non Life Sciences => Physics, Astronomy & Cosmology => Topic started by: Bill S on 07/01/2019 15:23:33

Entropy is just a natural outcome of statistics. The fact that it is increasing only means that the universe started in a highly ordered state, and is still much more ordered than it ultimately will (can) be.
Re: Does time stand still in the quantum world? #13.
There are a few points I would like to clarify from this post, let's start with this one. This has echoes of Sean Carroll’s assertion that a “Past Hypothesis”, demanding a low entropy boundary condition, is necessary in order to justify an expanding universe theory.
Moving away from the idea of entropy as a measure of disorder, and thinking of it as the number of accessible energy eigenstates, how reasonable is it to think of the first instant of the Universe as having low entropy?
If all the matter and energy in the Universe today were “packed” into an infinitesimally small “speck”; surely, the number of accessible energy eigenstates would have been zero, or as near to that as uncertainty permits.

I am not confident to claim anything along the lines of an infinitesimal initial state, although that would likely have little entropy (though the holographic principle invoked to model information in black holes might have some bearing here...) I am also not so certain that we need a boundary condition. I have discussed this elsewhere (https://www.thenakedscientists.com/forum/index.php?topic=73163.0), but I don't see a clear reason that there couldn't be an asymptotic approach to that boundary, which then essentially makes the actual boundary irrelevant...
However, it also seems almost self-evident (*warning bells*) that the early universe must have been more ordered than it is now. Given how many particles there are in the universe, it is fantastically unlikely that entropy would randomly decrease at any given moment, let alone several successive moments. And seeing as we are in a fairly highly ordered state compared to how things could be, it would appear that we are all that much less likely to have ever been in a more disordered state.
I realize that I am using "order" and "disorder" here, and you are asking about states. It's just easier to talk about... I gotta go now, but we can delve deeper soon...

Entropy is just a natural outcome of statistics. The fact that it is increasing only means that the universe started in a highly ordered state, and is still much more ordered than it ultimately will (can) be.
How can entropy decrease on a universal scale, while having structures (atoms, planets, stars, galaxies, plants, life forms, etc) being formed by organizing entities (gravitational, chemical, nuclear, DNA, etc)?

https://www.thenakedscientists.com/forum/index.php?topic=73163.0
I’ll have to reprise that thread. The only thing I recall from it is that Alan said: “I think Einstein pointed out that time is what prevents everything from happening at once.”
I thought it was Wheeler, so I checked. It seems that Ray Cummings (a Sci Fi author) is the most likely candidate.

If entrophy is increasing universally, it is proceeding to equilibrium. This implies that Universe is a closed system. In an expanding, closed system, a catalyst is required to speedup/increase entrophy. What do you propose that catalyst would be?

If entrophy is increasing universally, it is proceeding to equilibrium. This implies that Universe is a closed system. In an expanding, closed system, a catalyst is required to speedup/increase entrophy. What do you propose that catalyst would be?
First: it's entropy, not entrophy (I wouldn't point it out if it were a lonely typo, but it was misspelled in every instance of the word in the previous post)
Second: increasing entropy doesn't imply a closed system, though we typically define our universe to be closed anyway...
Third: No catalyst is required for entropy to increase, even at an increasing rate...

IF entrophy is increasing universally, it is proceeding to equilibrium. This implies that Universe is a closed system. In an expanding, closed system, a catalyst is required to speedup/increase entrophy. What do you propose that catalyst would be?
First: it's entropy, not entrophy (I wouldn't point it out if it were a lonely typo, but it was misspelled in every instance of the word in the previous post)
Second: increasing entropy doesn't imply a closed system, though we typically define our universe to be closed anyway...
Third: No catalyst is required for entropy to increase, even at an increasing rate...
Thank you for pointing out my dyslexic condition out to me, I think your wrong about points two and three though. I will attempt to catch my spelling mistakes but never I never promise anything. Please provide documentation for supporting your position. below are mine.
"Following the second law of thermodynamics, entropy of an isolated system always increases for irreversible processes. The difference between an isolated system and closed system is that heat may not flow to and from an isolated system, but heat flow to and from a closed system is possible." plenty of wiggle room there if you accept the expansion of Universe as irreversible.
the cosmic radiation background of the Universe's temperature is remarkably consistent. This to me, maybe not to you, presents an ordered system. An ordered system presents as a constant stable rate of entropy. The IF in my first statement places all that follows as a hypothetical question as to what would cause an increase in universal entropy in a closed system. If a system is constant and ordered its entropy is constant and ordered. An expanding Universe is increasing in entropy, and you may wish to disagree with me ... but a catalyst is require to change the rate of entropy in a constant system. So, what catalyst caused the a constant stable thermal Universe to accelerate in conjunction with an increase in entropy? Its a simple question.
or and u misspelled misspelled, do have dyslexia also? lol.

I have just reread my post, please excuse the mistakes. For me dyslexia is a lot like stuttering, it easily aggravated by other people's reaction to it.

For those unacquainted with dyslexia, I can read what I have written and not see my mistakes. I only see my thoughts not my written response. its weird! lol

Thank you for pointing out my dyslexic condition out to me, I think your wrong about points two and three though. I will attempt to catch my spelling mistakes but never I never promise anything. Please provide documentation for supporting your position. below are mine.
"Following the second law of thermodynamics, entropy of an isolated system always increases for irreversible processes. The difference between an isolated system and closed system is that heat may not flow to and from an isolated system, but heat flow to and from a closed system is possible." plenty of wiggle room there if you accept the expansion of Universe as irreversible.
If entropy were decreasing, that would prove that the universe is not a closed and isolated system. However, there are many examples of open systems with increasing entropy (like the cup of coffee on my desk into which the added cream is dispersing). Therefore, establishing that entropy is increasing in a system does NOT establish that the system is closed.
the cosmic radiation background of the Universe's temperature is remarkably consistent. This to me, maybe not to you, presents an ordered system.
To me this suggests that the whole universe is the same age, and that expansion has been constant across it. This may be an indication of order in some way, but it is not apparent to me.
An ordered system presents as a constant stable rate of entropy.
Entropy is not a process so it cannot have a rate. Entropy IS disorder, so an ordered system has less entropy than a more disordered system.
The IF in my first statement places all that follows as a hypothetical question as to what would cause an increase in universal entropy in a closed system. If a system is constant and ordered its entropy is constant and ordered.
I don't know what this means... (see above for what entropy is)
An expanding Universe is increasing in entropy,
I'll agree with you there.
and you may wish to disagree with me ... but a catalyst is require to change the rate of entropy in a constant system. So, what catalyst caused the a constant stable thermal Universe to accelerate in conjunction with an increase in entropy? Its a simple question.
As I mentioned above, entropy is not a process, so we can't talk about the rate of entropy. If I assume that you mean the rate of change of entropy, we still don't need to invoke a catalyst. Increasing entropy is the nature of nature. So I don't see what the question is.
or and u misspelled misspelled, do have dyslexia also? lol.
Too bee fare, yew gout me their. Butt inn cases four witch won word wood mean x, and the other why, won out too bee specially carefree. ;)

But back to the point of this thread,
Another way to think about entropy is asking the question: what is the smallest amount of information I (let's be honest, not I, but some sort of demigod) would need to describe the system perfectly?
One might be able to come up with an enormous data table that includes the instantaneous position, momentum, velocity, acceleration, and all other moments for each particle in the universe, as well as every interaction between each particle. Even if the universe were finite, this would still be an inconceivably huge amount of information, whether the whole universe were a giant perfect crystalline lattice at absolute zero, or the hot mess that it actually is. The trick is that in the former case (the cold crystal), many of those data points are redundant. If every unit cell within the lattice is an exact copy, then we only need the information to describe a single unit cell, and then info on how many identical units there are, and what their arrangement is. Therefore it is a low entropy system.
One way (not necessarily correct) of thinking about how the entropy of a system increases, is that as time goes on, each of those particles gains history, which is somehow encoded into the whole system. So even if all of the particles are identical, their positions, velocities etc. have changed based on the sum of all of their interactions to date. Thanks to conservation laws we can still describe the total amount of energy or momentum or charge etc. with single numbers. But once we care to see how that energy, charge and everything is distributed within the system, then we need lots of extra information.
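To make the "smallest description" picture a bit more concrete, here is a rough sketch of my own (not part of the original discussion): Shannon's formula gives the average information per symbol needed to describe a sequence, and a perfectly repetitive "crystal" needs far less than an evenly mixed "hot mess":

```python
import math
from collections import Counter

def bits_per_symbol(seq):
    """Shannon entropy in bits per symbol, from symbol frequencies."""
    counts = Counter(seq)
    n = len(seq)
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h if h > 0 else 0.0  # entropy is never negative; tidies IEEE -0.0

# A "perfect crystal": every site identical -- nothing to describe
# beyond "repeat this unit 1000 times".
crystal = "A" * 1000

# A "hot mess": eight distinct symbols, equally frequent.
mess = "ABCDEFGH" * 125

print(bits_per_symbol(crystal))  # 0.0
print(bits_per_symbol(mess))     # 3.0
```

This only counts single-symbol frequencies, so it understates the savings from repeated structure, but it illustrates why the cold crystal is the low-entropy case.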
The thing that blew people's minds in the mid 19th century is that they realized that purely deterministic views of the universe couldn't capture this apparent increase of information. Because if it were perfectly deterministic, then one need only define the starting state and how much time had elapsed, meaning that the state of the universe at every point in time would have to have the same entropy... which they could experimentally prove was not a realistic model.
It wasn't until QM came along that they were willing to relinquish the deterministic picture (and even still there were and are some holdouts)

There’s another point from "Does time stand still in the quantum world?" that I think is worth a mention.
Imagine four coins that can be either H or T. At regular intervals a coin is selected at random, and flipped such that it has a 50/50 chance of being H or T.
If the system starts out HHHH (the lowest possible entropy for the system), then after the first flip, entropy has a 50% chance of increasing and a 50% chance of staying the same. Eventually the system will reach a state of maximal entropy, from which entropy can then only decrease or stay the same. Ultimately, as this game continues, the system will wander through all possible states, spending time at each entropy level in proportion to the number of states at that level.
This is great as an analogy, but like the frequently encountered deck of cards analogy, it can easily give a wrong impression.
Just changing the position or orientation of macroscopic objects from an arrangement that we define as orderly, to one that we define as disorderly doesn’t necessarily change its entropic state. Does turning a coin over alter the number of microstates accessible to its constituent particles?
Of course, the act of flipping the coin involves an increase in entropy, but this change is inherent in the "flipper", not in the coins as individual objects, or as a group.
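As a quick illustration (my own sketch, not part of the original posts), the four-coin game is easy to simulate; in the long run the fraction of time spent in an all-same arrangement comes out near 1/8, regardless of the orderly starting point:

```python
import random

random.seed(1)                      # fixed seed for a reproducible run
coins = ["H", "H", "H", "H"]        # start in the lowest-entropy state
steps = 100_000
all_same = 0

for _ in range(steps):
    i = random.randrange(4)         # pick one coin at random...
    coins[i] = random.choice("HT")  # ...and flip it: 50/50 heads or tails
    if len(set(coins)) == 1:        # HHHH or TTTT
        all_same += 1

frac = all_same / steps
print(frac)  # hovers near 2/16 = 0.125 once the game forgets its start
```

The random walk visits every microstate with equal long-run frequency, which is exactly why macrostates with more microstates dominate.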

Entropy is a confusing concept. The idea is that it goes from order (low) to disorder (high). If we take the heat death presumed for the universe, then that should be total disorder (aka a thermodynamic equilibrium). Thermodynamic equilibrium sounds a lot better to me than 'total disorder' semantically.

There’s another point from "Does time stand still in the quantum world?" that I think is worth a mention.
Imagine four coins that can be either H or T. At regular intervals a coin is selected at random, and flipped such that it has a 50/50 chance of being H or T.
If the system starts out HHHH (the lowest possible entropy for the system), then after the first flip, entropy has a 50% chance of increasing and a 50% chance of staying the same. Eventually the system will reach a state of maximal entropy, from which entropy can then only decrease or stay the same. Ultimately, as this game continues, the system will wander through all possible states, spending time at each entropy level in proportion to the number of states at that level.
This is great as an analogy, but like the frequently encountered deck of cards analogy, it can easily give a wrong impression.
Just changing the position or orientation of macroscopic objects from an arrangement that we define as orderly, to one that we define as disorderly doesn’t necessarily change its entropic state. Does turning a coin over alter the number of microstates accessible to its constituent particles?
Of course, the act of flipping the coin involves an increase in entropy, but this change is inherent in the "flipper", not in the coins as individual objects, or as a group.
Discounting the change in entropy of the universe caused by the action of flipping the coin, we can still determine the entropy of just the system defined by the coins.
There are 2^{4} = 16 possible states
There is only one way to have four heads: HHHH
and analogously only one way to have four tails TTTT
There are four ways to have three heads HHHT HHTH HTHH and THHH
and analogously four ways to have three tails TTTH TTHT THTT and HTTT
then the remaining six states have two of each
HHTT HTHT HTTH TTHH THTH and THHT
so, after 10000 flips, it is most likely that there are two of each, and only a 1/8 chance that they are all the same (1/16 HHHH + 1/16 TTTT)
This system is still small enough that there isn't a huge difference between the highest and lowest entropy states. But if we had 1200 coins, or as someone else pointed out in the other thread, 10^{24} coins, then there is effectively zero chance that the coins would all have the same arrangement, once allowed to evolve from the lowest entropy state for a few turns.
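The counting above is small enough to check exhaustively. A throwaway sketch (assuming Boltzmann's S = k ln W with k set to 1, which is my framing rather than anything stated in the thread):

```python
from itertools import product
from collections import Counter
from math import log

# All 2**4 = 16 microstates of four coins.
states = list(product("HT", repeat=4))

# Group microstates into macrostates by the number of heads.
multiplicity = Counter(s.count("H") for s in states)
print(dict(sorted(multiplicity.items())))  # {0: 1, 1: 4, 2: 6, 3: 4, 4: 1}

# Chance of an all-same arrangement under a uniform distribution:
print((multiplicity[4] + multiplicity[0]) / len(states))  # 0.125, i.e. 1/8

# Entropy S = ln W per macrostate: highest for the two-heads macrostate.
entropy = {heads: log(w) for heads, w in multiplicity.items()}
```

With 10^24 coins the same counting makes the all-same macrostate vanishingly improbable, which is the point of the post above.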

I’m confident that your maths will be flawless, and your reasoning impeccable, but are you talking about entropy or order?
Perhaps you are taking John von Neumann’s advice: “You should call it entropy…..[because] no one knows what entropy really is, so in a debate you will always have the advantage.” :)

The fact that von Neumann was talking to Claude Shannon does raise a question as to the degree to which thermodynamic entropy and information entropy have become “entangled”.
https://en.wikipedia.org/wiki/Entropy_(information_theory)

Entropy is a confusing concept. The idea is that it goes from order (low) to disorder (high).
I’m posting this because I’m trying to clarify a point in my mind, and would welcome comments. (The number of objects is not significant).
Take 10 identical objects. Place them in a neat, straight line. (order).
Move these objects about such that they are randomly positioned. (disorder).
Which has higher entropy and why?
Take 10 identical objects. Place them in a neat, straight line. (order).
Take 10 identical objects. Place them randomly on a surface. (disorder).
Which has higher entropy and why?

Maybe I got no response because the correct word was 'increase' instead of 'decrease'. Can't blame it on dyslexia, just carelessness. The argument is still valid. As long as there are organizing processes in the universe, it will never 'run down'. I've seen entropy defined as a trend toward more uniform energy distribution, i.e. equilibrium, which would more likely occur in a closed system.
A deterministic universe is not possible, since it would require a simultaneous knowledge of its current 'now' state.

Entropy is a confusing concept. The idea is that it goes from order (low) to disorder (high).
I’m posting this because I’m trying to clarify a point in my mind, and would welcome comments. (The number of objects is not significant).
Take 10 identical objects. Place them in a neat, straight line. (order).
Move these objects about such that they are randomly positioned. (disorder).
Which has higher entropy and why?
Take 10 identical objects. Place them in a neat, straight line. (order).
Take 10 identical objects. Place them randomly on a surface. (disorder).
Which has higher entropy and why?
They may all have the same energy.
Shuffle a deck of cards out of the pack. The cards have an order, it's just not the one matching the standard definition.
Let's move ahead with a set of atoms at approx. 0 Kelvin. Near zero energy, but having a fixed arrangement, highly ordered.

If entropy is increasing universally, it is proceeding to equilibrium.
The universe at present is proceeding towards equilibrium with the CMBR at 2.7K.
However, in another 14 billion years or so, the CMBR will be down to perhaps 1.3K (depending on what Dark Energy gets up to in the meantime). The universe will then be proceeding towards equilibrium with this new, lower temperature.
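The arithmetic behind that figure can be sketched in one line: the CMB temperature falls as 1/a with the scale factor a. The doubling of the scale factor over roughly the next 14 billion years is my assumption for illustration, not a fitted cosmological result:

```python
T_now = 2.725      # present CMB temperature in kelvin (COBE/FIRAS value)
a_growth = 2.0     # assumed scale-factor growth over the next ~14 Gyr
T_future = T_now / a_growth  # blackbody radiation cools as 1/a

print(round(T_future, 2))  # 1.36 -- in line with the ~1.3 K quoted above
```

How fast a actually grows depends on what Dark Energy does, as the post says, so a_growth is the uncertain input here.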
This is described as the "Heat Death of the Universe", which, ironically, is currently expected to be cold and dark.
See: https://en.wikipedia.org/wiki/Heat_death_of_the_universe
the cosmic radiation background of the Universe's temperature is remarkably consistent. This to me, maybe not to you, presents an ordered system.
The great consistency of the CMBR temperature across the sky suggests that the early universe was in thermal equilibrium, at the time of last interaction between light and matter, which is thought to be around 300,000 years after the Big Bang.
The theory of cosmic inflation suggests one way that this thermal equilibrium could have occurred.
The very tiny variations in CMBR across the sky have been explained in terms of quantum fluctuations in this thermal equilibrium.
See: https://en.wikipedia.org/wiki/Inflation_(cosmology)
I recently read "Just Six Numbers" by Martin Rees (year 2000), in which he suggested that some of the CMBR variation across the sky could be due to gravitational lensing by distant masses in the universe, effectively magnifying parts of the CMBR. I can see that this would increase the intensity of the CMBR, but should not change its temperature? Is this still a current theory?

Maybe I got no response because the correct word was 'increase' instead of 'decrease'.
I meant to query this but lack of time, and an influx of responses deflected me.
#2 should now read: “How can entropy increase on a universal scale, while having structures (atoms, planets, stars, galaxies, plants, life forms, etc) being formed by organizing entities (gravitational, chemical, nuclear, DNA, etc)?”
Possibly the answer is that entropy can, and does, decrease locally; but the mechanisms of such decreases give rise to globally increased entropy.
These days entropy is firmly linked to atomic theory and statistical physics, but engineers were using the concept before the scientific establishment in general believed that atoms were more significant than any other Greek “mythology”.
Shuffle a deck of cards out of the pack. The cards have an order, it's just not the one matching the standard definition.
Precisely, their order makes no difference to their thermodynamic entropy, but global entropy is increased by the action of shuffling.
Let's move ahead with a set of atoms at approx. 0 Kelvin. Near zero energy, but having a fixed arrangement, highly ordered.
I would see this as a “system” that (locally) has very low entropy. Presumably, you were thinking in terms of a process by which the temperature of this “set of atoms” was reduced to “near zero energy”. This process must lead to globally increased entropy.

Phyti: "A deterministic universe is not possible, since it would require a simultaneous knowledge of its current 'now' state."
Well, you can. Just allow all possibilities statically. Forget 'time' for it. Then allow for an arrow and decoherence, scaling it up. Also allow for HUP and 'free will'. Now you've got both: your local arrow combined with your free will, redefining your reality in a dynamic manner. It's not 'deterministic' in the usual sense, although you can see it as all 'states possible' already being 'known'. The beauty of it is that I don't need to introduce 'new universes' for each 'probability possible'; I do it the opposite way, letting your free will create reality, with the rest being 'probabilities' unfulfilled.
You could possibly see it as a result of so-called 'superpositions'. If one assumes that 'systems' unobserved are in superpositions, then those include all possibilities too. The definition of an outcome is another thing, but as long as those superpositions exist the system is open for interpretations, no matter what the question and/or circumstances might be. With the probability of an outcome defined by the circumstances surrounding 'the experiment/question' etc.

In the universe we have families of galaxy types. These include elliptical and spiral galaxies. On this scale we see order. That is irrelevant. There is disorder in the detail. In other solar systems we are finding planets that continue to defy expectation. There is a greater variety than expected. So entropy may really be a localised phenomenon which is hidden from view on a global, universal scale. Much as the quantum world is hidden from our macroscopic view. The devil really is in the detail.

galaxy types... These include elliptical and spiral galaxies.
Spiral galaxies have a somewhat orderly motion within the galactic disk, and somewhat less orderly in the central bulge.
- This is because, over time, stars which have high velocity or orbit well outside the galactic plane (eg captured dwarf galaxies) have their motions averaged out. This averaging of angular momentum means that more mass ends up in the galactic bulge (and ultimately, in the central black hole).
Elliptical galaxies have much less order in the motion of their stars (higher entropy)
- Some astronomers have suggested that elliptical galaxies could form from the recent merger of two large spiral galaxies ("recent" in astronomical terms)
- This would throw the two orderly motions into chaotic motion
- It is only with time that the random motions of the individual stars average out
- Some stars will plunge close to the new galactic core (and perhaps feed the new, merged galactic black holes)
- Others will be thrown out of the galaxy entirely (where it is hard to see them)
Such a merger is on the horizon for our Milky Way galaxy, and the larger Andromeda galaxy, in about 5 billion years.
This chaotic stage during the merger makes it hard to time-reverse a galaxy merger as a thought experiment
- If you could measure the position, velocity and mass of all the stars of a merged galaxy (perhaps not possible if they have been swallowed by the central black hole, and not easy otherwise)
- And try running time backwards in a simulation
- I doubt that you will end up with two neat spirals!

Just allow all possibilities statically. Forget 'time' for it. Then allow for an arrow and decoherence, scaling it up. Also allow for HUP and 'free will'. Now you've got both: your local arrow combined with your free will, redefining your reality in a dynamic manner. It's not 'deterministic' in the usual sense, although you can see it as all 'states possible' already being 'known'. The beauty of it is that I don't need to introduce 'new universes' for each 'probability possible'; I do it the opposite way, letting your free will create reality, with the rest being 'probabilities' unfulfilled.
Doesn't this result in a reality that exists only in the mind? Dorothy Rowe maintains that reality for each of us is the sum total of our brain’s interpretations of the flood of external stimuli it constantly receives, but she's a psychologist. Should we be looking for some objective reality?

Bill S #20;
I would see this as a “system” that (locally) has very low entropy. Presumably, you were thinking in terms of a process by which the temperature of this “set of atoms” was reduced to “near zero energy”. This process must lead to globally increased entropy.

This example was intended to remove any association of entropy with order. Clausius (1800's) is credited with the 2nd law of thermodynamics which defines entropy as 'a measure of available energy'. It applies to all forms of energy.
A star is formed from gravitation acting on a cloud of dust/gas. Isn't this a case of a diffuse low entropy object transforming to a high entropy object?
A seed is planted. In the presence of water and sunlight from the previous concentrated source, genetic code instructs the seed to form a structure, which when harvested, serves as an energy source for a life form.
This shows entropy simultaneously changing in both directions, globally and locally.
For a finite universe to 'run down', would require removal of the known forces of physics, which are organizing matter into concentrated systems of energy.

yor_on #21;
There are two things of interest.
1. What is the state of a system.
2. Do we know the state of a system.
Probabilities are weighted values from analysis of statistics (which are historical).
They are used for predictions, when causes are unknown (eg. particle physics) or too complex to calculate (eg. weather).
They are useful (eg. astronomical observations) only because the laws regulating the behavior of the universe are consistent.

Knowing the state of a system requires observation. If you are observing an object 100 ly distant, your knowledge of its state is outdated. You don't even know if it's still in existence. (The tourist pulls over to ask for directions. The old man replies: 'it's down the road, where the old school house used to be.')
Because light speed is finite, the age of your data depends on distance.
If you flip a coin, in the air it can be considered a superposition of HT. It doesn't have a definite state until it lands, is observed.

Bill S #24;
Doesn't this result in a reality that exists only in the mind?

Relativity has shown perception is reality confined to the mind.

However, there are many examples of open systems with increasing entropy (like the cup of coffee on my desk into which the added cream is dispersing).
I’ve just noticed this, and find myself wondering about it.
Your cup of coffee constitutes an open system. Heating it increased its entropy. What happens to the entropy of this system when you add cream?
The temperature of the cream is raised, so its entropy increases, but this lowers the temperature of the coffee, decreasing its entropy. Thus, the entropy of the “system” decreases. (?)