I’ve been chiseling away at Cycles of Time (2010), by Roger Penrose. I say “chiseling away,” because Penrose’s books are dense and not for the fainthearted. It took me three years to fully absorb his The Emperor’s New Mind (1989). Penrose isn’t afraid to throw tensors or Weyl curvatures at readers.
This is a library book, so I’m a little time-constrained. I won’t get into Penrose’s main thesis, something he calls conformal cyclic cosmology (CCC). As the name suggests, it’s a theory about a repeating universe.
What caught my attention was his exploration of entropy and the perception that our universe must have started with extremely low entropy.
As it turns out, this creates a bit of a puzzle, and there’s an aspect to it I’d never considered before. The puzzle comes from our understanding that, because entropy always increases, it must have been lower in the past.
We have similar logic regarding the big bang: The universe is expanding now, and appears to have been expanding in the past. Therefore it must have been smaller long ago.
We see that entropy always increases, so consistently that we recognize it as a law. Therefore it must have been lower long ago.
But there is a conundrum posed by the cosmic microwave background (CMB). It is seen as having extremely high entropy in virtue of its smooth distribution. That high-entropy distribution is a key argument in favor of cosmic inflation (for which CCC provides an alternate explanation).
What’s more, the primal big bang fireball, “infinite” in density and energy, is very much what we think of as a high-entropy situation. (It’s something that’s long bugged me: how is the big bang low entropy?)
Penrose’s CCC is an attempt to resolve the conundrum. He posits a very distant future when all that remains are low-energy photons and gravitational waves. The universe would be timeless (photons, unlike particles with mass, have no clock). This apparently has a conformal map to something that looks like a big bang in a rebooted universe.
Further, the highly accelerated cosmic expansion (due to dark energy) in the distant future of the previous universe (and eventually ours) is what looks like cosmic inflation in this universe (and the next).
It’s similar to big bounce theories in being cyclic, but there’s no big crunch, just continual expansion into nothingness followed by a conformal mapping (which I totally don’t understand) to a new big bang.
§ §
Whatever. Speculative metaphysics is interesting, but it’s ultimately a form of diamond-hard science fiction (which, admittedly, I love).
I can say that I haven’t found anything particularly objectionable about the idea (such as I have with the BUH, the MWI, or the MUH). All cosmologies require basic axioms, things that just are, and I find a physical, single, evolving, even repeating, universe more plausible than those three.
Penrose is clear about how speculative this is, but he does provide some avenues that might falsify or support his theory. He’s pursued some of them with what he considers interesting results.
It would take me a lot more readings, and probably learning more about tensors, to say much more about his conformal cyclic cosmology. Instead, since the part of the book that really grabbed me was about entropy, I thought I’d write about that.
§
If you search my blog for “entropy” you’ll get quite a few hits. I’ve mentioned it now and then, because it’s so fundamental to everything.
If you click the entropy tag, you get four posts that discuss it in detail (not including this one). The earliest of those is one of the first posts I published here: Barrel of Wine; Barrel of Sewage.
It gives what I still think is a pretty good informal overview of entropy. You should read it if you haven’t, since I’ll refer to the CD example. Today I’m going to give you a formal definition.
§ §
Entropy is defined differently depending on the domain. In communications (information theory), it measures the uncertainty, or information content, of messages. In computer science, it can refer to the randomness collected for cryptography.
Those are specialized views. The most general English-based definition is that entropy is a measure of disorder. This definition is so general it drives most physicists a little crazy (because: define “disorder”… and “measure”).
A more precise English-based definition is more opaque: Entropy is the number of indistinguishable micro-states consistent with the observed macro-states. Clear as a brick wall, yeah?
I’m going to try to knock down that wall. If I’m successful, you’ll have a basic understanding of the formal mathematical definition of entropy:

S = kB log Ω
Which, by the way, is what is sometimes known as Boltzmann entropy to distinguish it from other versions. For now just know that S is the entropy (and B is for Boltzmann).
§
Let’s go back to that second definition: Entropy is the number of indistinguishable micro-states consistent with the observed macro-states.
A macro-state is a property we can measure of the system as a whole, such as its temperature, pressure, or color. In the example of the CD collection, how well it’s sorted is a property of the whole collection.
A micro-state is a property of a piece of the system — usually, as the name implies, a small, even tiny, piece. The molecules of a gas or liquid all have individual properties, such as location and energy. Each CD in the collection has individual properties, such as its index in the set. (There’s a first, second, third, and so on.)
We say that micro-states are indistinguishable when different arrangements of micro-states have no effect on a given macro-state.
Consider air molecules spread out evenly throughout a closed room. The air has a pressure that’s the same for an unimaginable number of different arrangements of those spread-out molecules. So long as the molecules are evenly distributed, the pressure is the same.
So vast numbers of molecule arrangements are indistinguishable in terms of the air pressure. That’s high entropy.
Now suppose a magic spell suddenly compresses all the air in the room into a tiny one-inch cube in the corner. There are still many ways to arrange the molecules in the tiny cube, but far fewer than in the whole room. What’s more, since the rest of the room is now empty, even one molecule outside the cube is distinguishable.
The result is a much smaller set of indistinguishable micro-states and thus low entropy.
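To put rough numbers on that, here’s a little Python sketch (anticipating the toy model below). The grid sizes are my own made-up values: five positions per inch, so 125 positions in the one-inch cube, and a hypothetical 60×60×60 grid standing in for a five-foot room:

```python
import math

n = 125             # toy number of "air molecules" (see below)
cube_sites = 5**3   # positions in the one-inch corner cube
room_sites = 60**3  # hypothetical coarse grid over a five-foot room

# Ordered placements of n distinguishable molecules on each grid:
# perm(sites, n) = sites! / (sites - n)!
cube_ways = math.perm(cube_sites, n)  # cube exactly full: 125!
room_ways = math.perm(room_sites, n)

print(f"log10(cube arrangements) = {math.log10(cube_ways):.0f}")  # ~209
print(f"log10(room arrangements) = {math.log10(room_ways):.0f}")  # ~667
```

Confined to the cube, the molecules have hundreds of orders of magnitude fewer arrangements than when they roam the room. That gap is the entropy difference.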
§ §
In an average room, the number of air molecules is roughly 10^25. The number of ways that many molecules can be arranged in a one-inch cube is a number beyond comprehension. It’s basically the factorial of that 26-digit number of particles!
Since these numbers are beyond understanding, let’s take it way (way) down. Let’s assume there are only 125 air molecules (they are very large) and they fill the one-inch cube. We’re saying there are five molecules per inch, and the cube fits 5×5×5=125 molecules. Think of it as a little box of ball bearings.
It has 125! (factorial) ≈ 1.88×10^209 possible arrangements.
Suppose we expand the cube to two inches on each side. This increases the volume by eight (two cubed). Now there are 10×10×10=1000 locations for 125 molecules and 875 empty spaces.
This has 1000! ≈ 4.02×10^2567 possible arrangements.
Quite a bit more. Going from a one-inch cube of 125 molecules to a two-inch cube increases the number of micro-states by over two-thousand orders of magnitude.
My calculator isn’t capable of doing 3375!, which is the number of possible arrangements for a three-inch cube (15×15×15 = 3375 locations). It works out to a number with more than ten thousand digits.
This is with only 125 molecules in a three-inch cube. Imagine how huge the numbers get with realistic numbers of air molecules and the full size of the room.
(The problem is we can’t imagine numbers like these. As we’ll see below, we’ll get a slight assist in dealing with such huge numbers.)
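Python, unlike my calculator, handles these numbers without complaint. A minimal sketch using the same toy cubes (math.lgamma(n + 1) is ln(n!), which avoids building the gigantic factorial itself):

```python
import math

# Lattice sites in each toy cube (five sites per inch).
cubes = {"one-inch": 5**3, "two-inch": 10**3, "three-inch": 15**3}

for name, sites in cubes.items():
    # math.lgamma(sites + 1) = ln(sites!); divide by ln(10) for log10.
    exponent = math.lgamma(sites + 1) / math.log(10)
    print(f"{name} cube: {sites}! is about 10^{int(exponent)}")
```

It reports 10^209, 10^2567, and 10^10444, which is where the ten-thousand-digit figure for the three-inch cube comes from.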
§
This vast increase in possible arrangements is why entropy always increases.
Given the freedom to roam the entire room, the chance of all the molecules falling into one of the low-entropy arrangements, while not strictly zero, is effectively zero. Generally speaking, the odds of finding any lower-entropy configuration are extremely prohibitive.
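Using the toy grid from the earlier sketch, we can even put a number on those odds. Subtracting logs avoids underflow, since the probability itself is far smaller than anything a float can represent:

```python
import math

n, cube_sites, room_sites = 125, 5**3, 60**3

# log10 of P(all n molecules happen to sit in the corner cube)
log10_p = (math.log10(math.perm(cube_sites, n))
           - math.log10(math.perm(room_sites, n)))
print(f"P is about 10^{log10_p:.0f}")  # roughly 10^-458
```

Effectively zero, as claimed.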
Here’s the entropy equation again:

S = kB log Ω

Now we can make sense of the other parts.
The kB is a constant, a fixed number required to make the equation work.
(Mathematicians often use k for “konstant,” and the B refers to Boltzmann, who came up with this formula. The value is known as Boltzmann’s constant.)
The Omega (Ω) is the number of micro-states of the system in question. As just discussed, it’s generally an extremely large number. (Remember that it’s the number of configurations of particles, not the number of particles.)
Which is why the log operator is there. Vastly over-simplifying, the log of a number is how many zeros it has. (For instance, the log of 1000 is exactly 3. The log of 1 is exactly 0.)
Using the log of the micro-states number makes comparing the entropy of systems more reasonable. Instead of the unimaginably large numbers involved, we can compare their logs.
In the 125-molecule example, the one-inch cube had Ω ≈ 10^209, so its log is 209. The two-inch cube had Ω ≈ 10^2567, so its log is 2567. Therefore the entropy difference is 2567 – 209 = 2358.
The three-inch cube, with Ω ≈ 10^10444, has an entropy over 10,000. Even in our toy model, the entropy numbers get large. In a realistic model, the entropy numbers get astronomical (the configuration space numbers, as I said, are beyond comprehension).
(Note that I’ve seriously papered over certain aspects of logs.)
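For the curious, here’s what the toy comparison looks like in physical units. This is just the numbers above plugged into the Boltzmann formula (which, in physics convention, uses the natural log, so the base-10 exponents get converted):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K (exact by definition)

log10_omega_1 = 209   # one-inch cube, from the toy model
log10_omega_2 = 2567  # two-inch cube

# S = k_B ln(Omega); multiply a log10 value by ln(10) to get ln.
delta_S = k_B * (log10_omega_2 - log10_omega_1) * math.log(10)
print(f"Toy entropy increase: {delta_S:.2e} J/K")  # about 7.5e-20 J/K
```

It’s a minuscule value in joules per kelvin only because there are just 125 toy molecules; with a realistic 10^25 of them, the same expansion produces everyday-sized entropy changes.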
§ §
Entropy, and the second law of thermodynamics, work because of the behavior of micro-states plus time.
That entropy always increases is a statistical law, but a powerful one. It applies most strongly to large systems with lots of tiny pieces, but it’s a lurking presence in all systems.
When I pick this up again, it’ll be to explain (according to Penrose) the puzzle of the necessity of a low-entropy past versus the apparent high-entropy reality of the big bang.
Stay entropic, my friends!
∇
June 8th, 2020 at 4:35 pm
Good description of entropy!
The “disorder” description always bugged me. It seemed subjective and value-laden, a strange attribute for a physics concept. Once I understood the concept a little more, I understood why physicists sometimes use it as a quick description, but it remains an odd one for me.
Brian Greene, in his discussion of entropy and the big bang, identified gravity as the piece that takes what appears to be a high entropy situation and makes it a low entropy state. Gravity enables substantial transformation from that point. But without it, the universe would have been a high entropy state that just continued to thin as space expanded.
And that, to me, has always been what distinguished low from high entropy: how much transformation is latent in the state. High entropy states seem less primed for transformation. I do realize thinking of entropy this way borders on tautological, but it helps me.
June 8th, 2020 at 5:57 pm
Thanks! Seems like so many physics books cover the topic that I’ve read lots of examples. I have to say, Penrose’s treatment was impressive. It made the book worth reading.
“Brian Greene, in his discussion of entropy and the big bang, identified gravity as the piece that takes what appears to be a high entropy situation and makes it a low entropy state.”
Exactly! I cover that in tomorrow’s post. Along with why black holes have high entropy.
“High entropy states seem less primed for transformation.”
Depending on what you mean by “transformation” it can be read as stating exactly what’s going on. A high-entropy state transforming to a lower-entropy state isn’t forbidden, just really unlikely, often with odds that are effectively zero.
An abstract “visual” is the (very-many-dimensional) phase space representing system behavior. The system can be seen as tracing a curve through that space. That space can be divided into parcels (multi-dimensional volumes) representing the macro-states of the system. The size of those parcels depends on the number of micro-states for a given macro-state parcel.
The parcels for very low-entropy states are very small compared to those for higher-entropy states. The higher the entropy, the larger the parcel.
The system traces an essentially random path through the phase space. (Not actually random, since it’s physically determined, but the path is not generally predictable, which makes it seem random.) The odds of the path going from smaller parcels to larger ones are much greater than the odds of it finding smaller parcels. Once it’s in the largest parcel, the highest entropy state, the odds of it finding even noticeably smaller parcels are extremely low. With the very small parcels, effectively zero.
So if you read “transform” as moving along the phase space curve and “latent” as the odds greatly favoring larger parcels, that’s pretty much exactly the formal definition.
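If a toy simulation helps, here’s a crude one (my own sketch, not from Penrose). It’s the classic Ehrenfest urn: the macro-state is how many of N particles sit in the left half of a box, and its “parcel” size is the binomial count C(N, k), which is largest at k = N/2:

```python
import random

# Ehrenfest urn: each step, one randomly chosen particle hops to the
# other half of the box. Starting state: all N particles on the left,
# the smallest "parcel" (only one micro-state realizes it).
N = 100
left = N
random.seed(1)  # reproducible run

for step in range(1, 10_001):
    if random.randrange(N) < left:
        left -= 1  # picked a left-side particle; it hops right
    else:
        left += 1  # picked a right-side particle; it hops left
    if step % 2000 == 0:
        print(f"step {step:5d}: {left}/{N} on the left")
```

The count rushes toward 50/50 (the biggest parcel) and then just jitters around it. A return to all-on-the-left is allowed, but you’d wait essentially forever to see it.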
June 8th, 2020 at 6:52 pm
Oh lord, there you go again. Making me cross my eyes.
June 8th, 2020 at 7:52 pm
Do not operate heavy machinery or attempt to fly airplanes for 24 hours!
June 8th, 2020 at 8:27 pm
Stop it!!! 🤣 🤣 🤣
June 8th, 2020 at 8:51 pm
Not without a safeword. 😛
June 8th, 2020 at 9:09 pm
Hmm…🤔 “Ocelot.” Yes. Ocelot.
June 8th, 2020 at 10:02 pm
Ocelot it is.

Back when I picked mine, I tried to come up with a word that I was never, in those circumstances, likely to utter. I picked “Nixon”. 😐
June 9th, 2020 at 12:23 am
—“But there is a conundrum posed by the cosmic microwave background (CMB). It is seen as having extremely high entropy in virtue of its smooth distribution. That high-entropy distribution is a key argument in favor of cosmic inflation (for which CCC provides an alternate explanation).”
and then:
—“This vast increase in possible arrangements is why entropy always increases.”
Is this implying that if space was contracting that entropy would be decreasing?
Also, still working over our conversation. I’m trying to figure out how best to slow it down so that we aren’t talking about so many different things at once. My hunch is that we should start with emergence since that’s my least understood part. But I’ll need a few more days or so before starting that one back up. 🙂
June 9th, 2020 at 12:54 am
“Is this implying that if space was contracting that entropy would be decreasing?”
It could, but it depends on whether the number of arrangements changes.
If the same particles have the same number of arrangements, just in a smaller overall space, then entropy for that system remains the same. All we’ve done is change the scale of the system. If we magnified our view of it, it would look the same as before.
If the reduced space means fewer arrangements, then entropy drops.
In the gas cubes example, the actual space is always the whole room. The number of arrangements in the cubes is relative to the number of arrangements in the whole room. Or relative to the number of arrangements in bigger cubes. That’s a different situation than contracting the room to a smaller size.
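Here’s a quick sketch of the “fewer arrangements” branch, reusing the post’s 125 molecules and simply shrinking the number of available positions:

```python
import math

n = 125  # toy molecule count from the post

# If contraction removes available positions, the arrangement count drops:
for sites in (15**3, 10**3, 5**3):
    ways = math.perm(sites, n)  # sites! / (sites - n)!
    print(f"{sites:4d} sites: log10(arrangements) = {math.log10(ways):.0f}")
```

Fewer positions means fewer arrangements, hence lower entropy. If contraction merely rescaled the grid without removing positions, the count, and so the entropy, would stay the same.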
I’m fine with emergence and taking it one-at-a-time. See ya whenever!