Entropy and Cosmology

Last time I started talking about entropy and a puzzle it presents in cosmology. To understand the puzzle we have to understand entropy, which is a crucial part of our view of physics. In fact, we consider entropy to be a (statistical) law about the behavior of reality. That law says: Entropy always increases.

There are some nuances to this, though. For example, we can decrease entropy in a system by expending energy. But expending that energy increases the entropy in some other system. Overall, entropy does always increase.

This time we’ll see how Roger Penrose, in his 2010 book Cycles of Time, addresses the puzzle entropy creates in cosmology.

Previously I talked about air molecules in a room. Now I’ll look at the CD Collection example in light of the more formal definition presented last time.

[The CD Collection example is from my Barrel of Wine; Barrel of Sewage post, which you should probably read if you haven’t.]

For reference, here’s the entropy formula again:

S = K log Ω

(See the previous post for details.)

§ §

The macro-state of the CD collection is how sorted it is. The micro-states are the locations of each CD (first, second, third, etc.).

If every CD is in the correct place, the sort is perfect, and the entropy (relative to being sorted) is zero.

Per the entropy formula, we have: K log 1. Omega (Ω) is 1 because there is just one arrangement of CDs (of micro-states) that is perfectly sorted. The log of 1 is 0.0, and K×0.0 is still 0.0, so the entropy of a perfectly sorted CD system is zero. (Relative to being sorted!)

Suppose we take one CD and misfile it. We’ve actually introduced two errors: [1] there is a CD missing from where it should be; [2] there is a CD inserted where it shouldn’t be.

If there are N CDs in the collection, then there are N configurations with a single CD missing. Given a CD in hand, there are N locations it could be placed (including the correct slot).

That makes N² arrangements of the CD collection with a single CD out of place. If we have 1,000 CDs, there are 1,000,000 such arrangements.

Per the formula, we have K log 1000000, which is K×6.0.

So in moving from the perfect sort to a single misplaced CD, the entropy of the collection jumps from K×0.0 to K×6.0.

(Experts: I’m going to leave it at that, because I’m being a little shameless with logs. In fact the formula uses the natural log, but we can accommodate that by modifying Boltzmann’s K. For now just focus on the relative difference between the entropy numbers calculated here.)

(Also: Since K is constant and always present, from now on I’ll just use the numbers. So in this case we say entropy goes from 0.0 to 6.0 when there is just one misplaced CD.)
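These numbers are easy to sanity-check. Here’s a minimal Python sketch (mine, not from the post), using base-10 logs and dropping K as just described:

```python
# Sanity check of the CD-collection entropy numbers.
# Base-10 logs, with Boltzmann's K dropped, per the post's convention.
import math

N = 1000  # CDs in the collection, matching the example

# Perfectly sorted: exactly one qualifying arrangement.
omega_sorted = 1
print(math.log10(omega_sorted))  # 0.0

# One CD misfiled: N slots it could be missing from times
# N places it could be wrongly inserted = N*N arrangements.
omega_one_off = N * N
print(math.log10(omega_one_off))  # 6.0
```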


The worst case is the earthquake that knocks the CDs off the shelves, leaving a random pile on the floor. Zero sort order!

For 1000 CDs, there are 1000! (one-thousand factorial) possible arrangements. The resulting number has 2,568 digits.

Per the formula: K log(4.02×10^2567) ≈ K×2567.6

So the maximum entropy of the 1000-CD collection is roughly 2568 — much larger than the entropy of a perfect sort or one CD out of place. (This is the same level of entropy as the 125 “gas” molecules in the two-inch cube with 5-per-inch fixed spacing.)
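The worst case is just as easy to verify (again my sketch, not the post’s; Python handles the big integer directly):

```python
# The random pile: 1000! possible arrangements.
import math

N = 1000
log_omega = sum(math.log10(k) for k in range(1, N + 1))  # log10(1000!)
print(round(log_omega, 1))          # 2567.6, the max entropy in units of K
print(len(str(math.factorial(N))))  # 2568, the digit count of 1000! itself
```

A random shake thus has about 1 chance in 10^2568 of landing on the perfect sort: not completely impossible, but as close as makes no difference.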

The maximum entropy of the sort order of just 1000 CDs isn’t very large. (As I showed you last time, when dealing with real world particle systems the numbers are huge beyond comprehension.)

Even so, there is still a considerable difference between the one-million micro-states of the “one CD misplaced” macro-state and the 2568-digit number of micro-states of the “random pile of CDs” macro-state.

It’s easy to see how shaking the pile is not at all likely to result in accidental sorting (although it’s not completely impossible).

§ §

Turning at last to cosmology, we can now see exactly why black holes are the source of most of the universe’s entropy.

One might think black holes should have low entropy because they’re so simple. They have no hair; they only have mass, charge, and spin.

But that’s exactly why they have such high entropy.

Consider the vast number of atoms in a large star that ends its life by collapsing into a black hole. Prior to the collapse, the star’s macro-state includes all its properties: the light it emits, its magnetic field, the stellar wind, flares, etc.

While there are a vast number of micro-states for the star’s particles — and hence a fair amount of entropy — the star’s macro-state is complex. Comparatively few arrangements produce it.

After collapse, all those micro-states no longer affect the properties of the black hole, except for its mass, charge, and spin. They become irrelevant, resulting in a massive increase of entropy.

The supermassive black holes in the hearts of galaxies contain the micro-states of millions of stars reduced down to three properties: mass, charge, and spin.

Most of the entropy of the universe is in these supermassive black holes.
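The post doesn’t give the formula, but the standard Bekenstein-Hawking entropy, S = kc³A/(4Għ) with A the horizon area, puts numbers on this. A rough Python sketch, with rounded constants, for a one-solar-mass black hole:

```python
# Bekenstein-Hawking entropy of a solar-mass black hole (rounded constants).
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s
M = 1.989e30       # solar mass, kg

# Schwarzschild horizon area: A = 16*pi*G^2*M^2 / c^4
A = 16 * math.pi * G**2 * M**2 / c**4

# Entropy in units of Boltzmann's constant: S/k = c^3 * A / (4*G*hbar)
S_over_k = c**3 * A / (4 * G * hbar)
print(f"{S_over_k:.1e}")  # about 1.0e+77
```

That is of order 10^77, some twenty orders of magnitude more than the thermodynamic entropy of the Sun itself.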


Entropy, to me, seems the result of the behavior of particles plus time. Some believe time arises from entropy, that entropy is the fundamental property. It’s a variation of the notion that change is fundamental and that time arises from change.

I do not find these notions persuasive. They are in some part motivated by another odd notion: that there’s no time in physics. (Which, in turn, seems to come from the idea that physics works forwards and backwards, rather than a genuine perceived lack.)

The thing is, one is hard-pressed to find any physics equation that does not, in some way, contain the ubiquitous t (for time) variable. Time is very much a fundamental notion in the mathematics of physics.

Kant saw time as one of our most fundamental intuitions, even more so than our other fundamental intuition, space. (And how interesting that many years later Einstein would knit them into fundamental spacetime.)

FWIW, I believe time is fundamental and axiomatic to reality. Time just is. It may be more fundamental than space in that time probably existed before the big bang (which is where our space began).

The point of which is that big crunch scenarios, which posit the eventual end of cosmic expansion and the resulting collapse back to singularity, sometimes suggest time runs backwards during collapse. This is (supposedly) because time arises from entropy, and such scenarios predict a reverse of entropy.

Penrose suggests, and I agree, this is very unlikely. It is far more likely entropy will, as always, and as the Second Law requires, continue to increase. Time will certainly not rewind.


One thing Penrose points out is that, by the time the universe stopped expanding and started collapsing, there would be lots of black holes.

As the universe collapses, those black holes merge to form ever larger black holes. As the collapse approaches the singularity, the universe consists of a churning mass of black holes.

This is nothing like the astonishingly smooth cosmic expansion resulting from the big bang. The big crunch will not be its mirror image, but something else entirely.

On this count alone, we see the collapse of the universe — should it happen — must involve a massive increase of entropy as black holes merge.


The last of Penrose’s points (for now): the above analysis of black holes is a clue to how the early universe had low entropy.

We see that black holes increase entropy when they swallow matter. Since black holes are a phenomenon of gravity, perhaps that’s the clue we need.

[As an aside, every form of energy ultimately is due to gravity. The reason the universe gets away with big banging from nothing is that expansion against gravity acts like stretching a spring — it puts energy into the spring. Or, in our case, matter and energy into the universe.]

What distinguishes the early universe from the later one is the clumping of matter due to gravity.

Penrose’s argument suggests that the even distribution of matter in the beginning implies gravity was “switched off” (as Penrose puts it) at first. Gravity kicked in later and the diffuse cloud of hydrogen, helium, and a dash of lithium, began to collapse into the first stars and galaxies.

Given the canonical air-molecules-in-a-room example of entropy, it’s hard to see the diffuse early universe as the low-entropy state and the modern clumpy universe of galaxies as the higher-entropy one. Structure doesn’t seem “disordered.”

The difference is that gravity plays no role with gas molecules. The situations aren’t analogous. In the air, the diffuse state is the macro-state with the most arrangements. Squeezing the air into a cube is an unlikely state, so naturally the evolution of the system is towards diffusion.

But with gravity, the diffuse state can’t survive. Any disturbance or imbalance starts a collapse towards clumping. And the number of micro-states that promote those changes is much larger than the number behind what amounts to the perfectly “sorted” state of diffusion.

§ §

I may never understand Penrose’s conformal cyclic cosmology (CCC) idea — let alone be able to judge it — but the book is a success for me just on the entropy discussion.

There’s a lot I didn’t go into, but Penrose’s treatment was one of the most thorough and nuanced I’ve read. And, this being Penrose, one of the most mathematical. Say what you will about his metaphysics, but it is grounded in hard math.

Stay entropic, my friends!


26 responses to “Entropy and Cosmology”

  • SelfAwarePatterns

    I know Penrose was a proponent of the big crunch scenario, but I thought conformal cyclic cosmology was something different, something to do with successive aeons with the big bangs being transitions between them. (That said, I totally don’t understand it either, and other than the news articles when he published it, haven’t read anything about it.)

    Thanks for the black hole entropy explanation! Makes sense, and fits with my little transformation crutch since the ability of black holes to transform is very limited.

    Every time I see the entropy equation, I’m reminded of the similarities between it and Shannon’s equation to measure information content, although he uses log base 2.

    • Wyrd Smythe

      CCC is about successive aeons. I touched very briefly on it in the previous post but didn’t really pick up that ball here, sorry! Mentioning the big crunch here was channeling Penrose’s debunking of big crunch scenarios, at least those that suggest a rewinding of the big bang. If a crunch happens, it will be quite different than a reversal of the bang.

His CCC idea requires the expansion to continue. He rough-guesses 10^100 years, which is far beyond many cosmological views (trillions or many trillions of years is common). At that point, assuming dark energy is constant, the expansion rate will be huge. In fact, he posits that this expansion is what looks like cosmic inflation in the next aeon. (He’s skeptical about cosmic inflation.)

CCC further requires that black holes all evaporate, releasing low-energy photons (Hawking radiation). It also requires massive particles to lose their mass, something Penrose speculates is at least possible that far down the line. He points out that it looks like mass came after the big bang, so it would be fitting if mass evaporates at the end of the aeon, too.

      Once the universe consists of nothing more than photons and gravity interactions (which have massless bosons), time effectively ceases, since massless particles have no clocks. This timeless, massless, rapidly expanding empty universe somehow has a conformal map to a new aeon, and that mapping looks like a big bang to us.

      His image is almost of a bamboo tube. Segments separated by bulkheads.

      Understanding the conformal map business requires tensor math, which is still way over my head. I’m still trying to figure out those diamond-shaped spacetime diagrams he uses. (If you saw the Matt O’Dowd videos about black holes, you saw the kind of diagram I mean.)

      “Makes sense, and fits with my little transformation crutch since the ability of black holes to transform is very limited.”

      LOL! If it works, it works!

Regarding Shannon’s equation, it’s based on Boltzmann and other thermodynamic views of entropy, so it’s not surprising it looks similar. The key similarity is using the log of micro-state counts. Shannon uses base 2 because his whole view of information reduces all information to bits, but logs convert. Boltzmann used the natural log. Penrose started his discussion with log base 10 because it’s just counting zeros of numbers people understand. I used it for the same reason, but it was playing a little fast and loose.

      • SelfAwarePatterns

        With CCC, the interesting question is, can any information survive the transition? Sounds like it would be tough. Although if bosons survive, maybe patterns could survive? Might we find messages buried in the CMB?

        I’ve seen 10^100 thrown around before for the timeline of black hole evaporation. When looking at the ultimate fate of the universe, the numbers can get crazy. I think Sean Carroll said a new big bang could happen spontaneously due to quantum fluctuations in about 10^10^10^50 years. (I may have missed a level, I’m going off memory.)

        From what I understand, Shannon worked out his equation independently, only noticing afterward that he had converged on entropy. He was advised by friends to play up the entropy connection when selling his theory.

      • Wyrd Smythe

        Oh, okay, didn’t know that. I guess I just assumed. I’ve never really dug into information theory.

        Wow, it’s hard to imagine what 10^10^10^50 amounts to. That’s a mind-boggling big number.

Penrose speculates that gravity interactions, specifically the merging of black holes, could cause gravity waves that might make their presence known in the CMB. He arranged to have some analysis done, with “interesting” results so far. There was something else involving photons, too.

        The gravity waves expanding out from their source apparently would leave faint circles of higher or lower energy in the CMB. Very hard to tease out, but that’s what he tried. I don’t remember it very well, his CCC stuff got very mathematical and I ended up skimming a lot of it. If I owned the book, or if I check it out again, I’d focus more on those sections. I’m not sure it would matter until I learn more about tensor math.

      • SelfAwarePatterns

On the mind-boggling number, yeah, at some point it just becomes notation, no longer anything we can even pretend to conceptualize.

If the CCC math is beyond you, it’s definitely beyond me. Penrose needs a Brian Greene to come in and give a layman’s explanation.

      • Wyrd Smythe

        Ha! As science explainers, Greene and Penrose are on rather opposite ends of the spectrum! Penrose does seem to be writing for a much narrower audience.

Thinking about it… Greene’s mission is talking about science to non-scientists, but I think maybe Penrose is just writing about his work and ideas in case anyone cares. And ya kinda have to wade through his books. (At least the two I’ve read. His interviews and talks usually take it down a notch. I would like to check out some of his other books. Fashion, Faith, and Fantasy sounds really interesting coming from him.)

      • SelfAwarePatterns

        Scott Aaronson, when Sean Carroll asked about his book, pointed out that it’s not really a popular science book (“popular” in the sense of being aimed at general audiences), although it’s not a scientific treatment either. He said he told the publishers, when asked what the audience would be for his book, that it would be similar to whoever was reading Penrose’s books.

    • Astronomer Eric

      “First applied to the operation of physical systems, entropy is a measure of disorder. But the very same principles of entropy that apply to physical thermodynamic systems, such as self-organization, apply to all information-processing systems, including the brain, nervous system, and psychological processes of humans. All biological organisms—including humans—survive insofar as they are able to effectively manage internal entropy.”

— Excerpt from Scott Barry Kaufman

      I just read this and immediately remembered this comment! This book is really good! I highly recommend it!

      BTW, is it bad form to post a quote like that on a blog?

      • Astronomer Eric

        Hmm…I seem to be having trouble replying to the right comment today. I’m referring to Mike’s earlier comment:
“Every time I see the entropy equation, I’m reminded of the similarities between it and Shannon’s equation to measure information content, although he uses log base 2.”

      • Wyrd Smythe

        And you did. Mike’s comment is not a reply, so it’s at indentation zero. My reply is at comment level one. His reply, and the rest of our conversation, is at comment level two, which is all I allow here.

        You correctly clicked the [Reply] link on Mike’s original comment, so you’ve started a new reply thread off that root comment. Your reply is at indent level one, the reply to your reply is level two, the max. Replies at the max indent level have no [Reply] links, so one has to look upstream for the previous indent level.

      • Wyrd Smythe

        Long as it’s not a habit, it’s fine, no problem

      • SelfAwarePatterns

        Hey Eric,
        Sorry, didn’t realize you had responded to me here. Thanks for the book recommendation. I do agree that entropy, life, and information processing are all tangled up with each other. Is that what you were trying to communicate? Or just wanted me to know about the book?

      • Astronomer Eric

        To Wyrd: Oh! Thanks! I wrote the comment on my phone, so it’s hard to see the indentations on it. I also normally just type the quote I’m replying to, but didn’t because typing on the phone is more cumbersome than on the computer. I was just worried that the lack of context in my comment would make it confusing.

        To Mike: Well, I wasn’t trying to communicate a point about entropy and information (since I’m not well versed enough in either field to make a valid comment), except that something I was reading reminded me of the comment you made about Shannon’s equation. That made me excited because much of what I write in your comments section, and here on Wyrd’s, is influenced by Maslow’s theory. This book is an update to Maslow’s theory and is turning out so far to be extremely insightful. I guess I thought you guys might be intrigued that the author was bringing entropy into the realm of psychology.

      • Wyrd Smythe

        Phones, ugh. Reading is fine, but typing is really hard! I can imagine it’s hard to see the indentation on a phone, too. For me, this commenting stuff is best done on a laptop or desktop.

        Entropy is a fundamental law that appears in everything, so it’s not surprising someone would use it in the context of psychology, either in the literal or metaphorical sense.

  • Deal

    I enjoy your blog. Thanks for wading through Penrose for us.

    As a chemist, I am of course firmly grounded in seeing entropy in the thermodynamic fashion, but it is insightful to see it in these other ways. When people use entropy to describe Shannon’s information theories, I find that a bit of a stretch (not the theories, but the referring to them as about entropy). Entropy is a measure of disorder, so what is the opposite of disorder? Well, we might say that information is the opposite, so any theory about information is conversely about entropy. But there are many other opposites of disorder—any kind of complexity would qualify—making it untenable to push the information comparison too far.

I do happen to know the missing link connecting Shannon to the third way of seeing entropy, in terms of black holes. That is via the physicist J.D. Bekenstein, who uses the capital letter “I” to stand for “information” in the negentropy sense, and he uses it explicitly for describing black holes. As a computer man, you might be interested in how Bekenstein has argued for the equivalency of information and physical laws. See, for instance,
Also, Bekenstein has written a popularization of his own work.

    • Wyrd Smythe


      I agree about entropy and information theory. It’s always seemed like an interpretation of entropy to me. I intuitively see entropy as being about arrangements of micro-states, so there’s something of an intellectual jump I have to make seeing it, in a sense, from the other side.

      As you say, it’s the more general nature of entropy that makes it feel odd talking about it with regard to information loss.

      Thanks for the Scholarpedia link. I keep forgetting it exists. For some of this technical stuff, it would be better to link to Scholarpedia than Wikipedia. (Although for technical stuff Wiki isn’t bad.)

      As I recall the Bekenstein bound is associated with the holographic theory Nima Arkani-Hamed has talked about so much. (I happened to re-read his old SciAm article just last week. Going through old issues before tossing them in the recycle.)

      I haven’t looked deeply into physics-as-information views. Despite being a computer guy, my intuition is that there is more to it. As a general rule, I’m not one for strongly reductive views.

OTOH, Arkani-Hamed’s theory calls for 10^60 distinct “colors” (as in QCD color) for particles living on the spherical boundary, which on its own seems to cast doubt on the idea (not to mention the whole anti-de Sitter space aspect). For now I see holographic theory standing next to string theory in the Fever Dreams of Mathematicians zone.

    • Wyrd Smythe

      I was just reading that Scholarpedia link. Very nice!

      This part was cool:

Note that a one-solar mass Schwarzschild black hole has an horizon area of the same order as the municipal area of Atlanta or Chicago. Its entropy is about 4×10^77, which is about twenty orders of magnitude larger than the thermodynamic entropy of the sun. This observation underscores the fact that one should not think of black hole entropy as the entropy that fell into the black hole when it was formed.

  • Wyrd Smythe

    Wow, synchronicity strikes again. PBS Spacetime just published a video about Penrose’s CCC:

  • Wyrd Smythe

    Congrats to Sir Roger Penrose for winning the Physics Nobel Prize!!

Peter Woit, on his blog, commented that it’s unusual for a Ph.D. mathematician to win a physics prize. (The real question is why he hasn’t won a Fields Medal.)

    I’ve always had a soft spot for Penrose and general sympathy with his views. Blame him for making me rethink computationalism. 😉

  • Wyrd Smythe

    Here is a speech Penrose gave about the paper he got (half) the 2020 Nobel Prize for:

    It’s kind of cute how he spends the first part talking about the paper that won the prize, but rather quickly pivots to his CCC theory. 😀

  • Entropy Isn’t Fundamental! | Logos con carne

    […] [For more on entropy see: Barrel of Wine; Barrel of Sewage or The Puzzle of Entropy and Entropy and Cosmology] […]

  • The Art of Debate | Logos con carne

    […] of score various thinkers rack up. Even published authors can be involved in the game (Tegmark, Penrose, others with striking and original […]
