
particles & their momenta
Over the decades I’ve seen various thinkers assert that entropy causes something — usually it’s said that entropy causes time. Alternatively, that entropy causes time to run in only one direction. I think this is flat-out wrong and puts the trailer before the tractor. (Perhaps due to a jack-knife in logic.)
The problem I have is that I don’t understand how entropy can be viewed as anything but a consequence of the dynamical properties of a system evolving over time according to the laws of physics. Entropy is the result of physical law plus time.
It’s a “law” only in virtue of the laws of physics.
I’ve posted about this before (see: Thinking About Time and Time and Thermodynamics), so I won’t repeat myself (at least not much). This post is more about a little project I had some fun with this past week. As well as being its own goal for this post, the project is a step on the way to a related bigger project I’ve had in mind for some time.
I’m not as visual as some visual artists I know (just like I’m not as musical as some musicians I know), but I am visual (and musical), and I enjoy things like Mandelbrot zooms, visualizations of John Conway’s Life game (and myriad related videos), along with other animations of mathematics.
It’s not just the randomness of the visual, although I do also enjoy clouds, water waves, campfires, and other random visuals. (Ever watched the leaves of a distant Aspen shimmer in the wind? Awesome!) It’s also thinking about the math behind the animation. As I’ve written about before (more than once), the Mandelbrot is especially mind-blowing in that regard.
§
Getting back to entropy, I’ve always liked those animations of particles that start off with all the particles concentrated in one corner and show how over time they spread out to fill the space uniformly.
I think it’s fun watching the little balls bounce around, and it’s interesting to focus on one to see what happens to it over time. Unfortunately, those animations can be short, small, or poor in resolution, drawing, or execution. Which gets me thinking, “Hmmm, I wonder if I could do something nice?”
Something like this:
And, hmmm, it turns out I can.
Surprisingly, it only took a few days to create. I would have finished sooner, but the first collision detection algorithm, in an attempt at being very fast, turned out to be very problematic (registering a single collision as multiple events due to overlap).
Doing actual distance calculations is slower, but I realized I don’t care about the true distance, only whether that distance is below a certain value. That means I can skip doing the square root in the distance calculation:
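For illustration, a sketch of the idea (assuming simple `(x, y)` tuples; the names are mine, not the actual code):

```python
def colliding(p1, p2, radius):
    """True if two particles overlap. Compares the squared distance
    to the squared collision distance, skipping the square root."""
    dx = p1[0] - p2[0]
    dy = p1[1] - p2[1]
    # distance < 2*radius  is equivalent to  distance**2 < (2*radius)**2
    return (dx * dx + dy * dy) < (2 * radius) ** 2
```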
Which does speed up the code. (And doesn’t affect the animation — just the time it takes to create the frames.)
That worked great once I killed an amusing bug in the collision handling routine. In creating the vector between the colliding particles, I used the absolute delta values, so the vector always pointed into the (+,+) quadrant. But it needs to point in any direction, which requires true delta values, not absolute ones.
What was amusing was that the vector is the basis for moving the two particles outside of collision range. Since it incorrectly always pointed in the same direction, it only pointed in the desired direction in 25% of the collisions. In another 25% it pointed in the exact wrong direction, so particles moved towards each other, putting them even deeper into collision range. (The final 50% jumped sideways.)
Fixing the vector was one of those tiny bits of magic — just removing the two abs() functions (I’m not sure what I was thinking when I used them) — that made all the difference. It was a good illustration of the old saw about “for want of a nail…”
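In sketch form (my own minimal version assuming `(x, y)` tuples, not the actual code), the corrected push-apart step looks like this:

```python
import math

def push_apart(p1, p2, min_dist):
    """Move two overlapping particles out of collision range along the
    line between their centers. The signed deltas are essential: with
    abs() the push vector always pointed into the (+,+) quadrant."""
    dx = p1[0] - p2[0]   # signed delta -- abs() here was the bug
    dy = p1[1] - p2[1]
    dist = math.hypot(dx, dy)
    overlap = min_dist - dist
    if overlap > 0:
        ux, uy = dx / dist, dy / dist       # unit vector, p2 toward p1
        p1 = (p1[0] + ux * overlap / 2, p1[1] + uy * overlap / 2)
        p2 = (p2[0] - ux * overlap / 2, p2[1] - uy * overlap / 2)
    return p1, p2
```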
(If you are curious, you can see the bug in action in earlier versions on my YouTube channel. Mainly note how some particles seem to couple together. In the version 1.x videos it’s because collision detection registered multiple events, so particles react, then react again and again, for however many events registered. In the version 2.0 video it’s because of the vector bug. In that one, also note the particles making funny little hops when they collide due to the vector pointing the wrong way. Either way, the result is they stay close and keep colliding.)
§
This project began last Friday evening as I sat idly pondering a note I’d written: “Entropy Isn’t A Force!” It came from having seen yet another theorist putting the entropy cart in front of the time horse. It’s one of those modern “scientific” views I find hard to fathom. I don’t see how entropy can be framed as anything other than an emergent process.
Of course, it’s always possible I just don’t get it, but until someone presents an explanation that frames entropy as fundamental in a way that’s sensible, I’m dubious entropy is anything but an emergent consequence of physics plus time.
In this version I wanted the particles to be all the same color, except for a handful (one in ten is red). I thought that would make it easier to follow the red ones. And it is, but I meant for there to be fewer red ones. During testing I did runs with fewer particles, so I used 1/10, but with 800 particles 80 was more than I wanted. I just forgot to change the setting before the run (and was too lazy to redo it).
Note that there’s also a “golden snitch” although it’s actually a blue particle. Unfortunately, the blue doesn’t contrast well with the gray. I should have used a different color. (Maybe actually golden!)
One thing about these simulations is that they’re frictionless and all collisions are head-on and elastic. Further, there are no glancing blows that partially transfer momentum. With head-on elastic collisions, one simply swaps the momentum vectors. Glancing blows require more involved calculations.

Newton’s cradle also demonstrates the total transfer of momentum.
Think about hitting a motionless eight ball dead on with the cue ball. All the momentum is transferred. The eight ball takes off and the cue ball comes to a stop. They trade momentum vectors.
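In code, the head-on equal-mass case really is that simple (a sketch, not the actual code):

```python
def head_on_collision(v1, v2):
    """Equal-mass, head-on elastic collision: the two particles
    simply trade velocity (momentum) vectors."""
    return v2, v1

# cue ball moving, eight ball at rest -- after impact they trade
cue, eight = head_on_collision((5.0, 0.0), (0.0, 0.0))
```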
(See the elastic collision Wiki page for some nice examples as well as a nice particle animation.)
A glancing blow results in new momentum vectors for both that combine and then split the original ones. Both the direction and energy change to new values.
As an example, imagine a barely glancing blow that changes the cue ball’s path only slightly while the eight ball slowly rolls away at an extreme angle.
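For the curious, the standard textbook treatment (for equal masses) projects each velocity onto the line between the centers and swaps only those components; the tangential components pass through unchanged. A sketch of that method (my own version, not necessarily what the pool-ball project will use):

```python
import math

def glancing_collision(p1, v1, p2, v2):
    """2D elastic collision between equal masses: exchange the
    velocity components along the line of centers; the tangential
    components are untouched."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dist = math.hypot(dx, dy)
    nx, ny = dx / dist, dy / dist            # unit normal (line of centers)
    a1 = v1[0] * nx + v1[1] * ny             # normal components...
    a2 = v2[0] * nx + v2[1] * ny
    d = a1 - a2                              # ...get exchanged
    return ((v1[0] - d * nx, v1[1] - d * ny),
            (v2[0] + d * nx, v2[1] + d * ny))
```

A dead-on hit reduces to the simple swap described above; a barely-glancing hit transfers almost nothing.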
I didn’t want to bother with that for this, but I will for the bigger project I have in mind. I want to create a physically accurate animation of pool balls. Then I want to create side-by-side animations with the same initial starting conditions — save for one tiny variation — and see how chaos causes the trajectories to diverge. (The animation is otherwise fully deterministic, so the exact same starting conditions would always result in the same animation.)
§
For this project all I needed was particles with a lawful behavior. They didn’t need to reflect real particle dynamics. The point I’m demonstrating is that my simple algorithm results in entropic behavior merely in virtue of applying its rules over time. The entropy it simulates is a consequence.
Recall that entropy is (some constant times) the log of a quantity called Omega (Ω), which is the number of micro states consistent with a given macro state of the system. Note that this requires some definition of a macro state, so one argument against entropy being fundamental is that it depends on observer definitions. What constitutes a macro state?
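In Boltzmann’s formulation:

```latex
S = k_B \ln \Omega
```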
So entropy has a virtual aspect as a measurement. While the videos illustrate entropic behavior, the video file itself, and the process of watching it, have essentially the same entropy no matter what they depict. The physical entropy of the system far outweighs any actual influence of the video content.
[For more on entropy see: Barrel of Wine; Barrel of Sewage or The Puzzle of Entropy and Entropy and Cosmology]
(By the way, information theory uses entropy a bit differently. I find that use a bit suspect, but that’s a post for another day. The point here is that entropy is a measurement or observation of system behavior, not a driving force.)
This new and improved version has 50% more particles than other versions!
With 1200 particles, it took a good fraction of a day to generate the 15,000 frames. To check for collisions, each pair of particles must be tested once per frame, so the total number of checks is the partial sum of the number of particles (the sum 1+2+⋯+(n−1)):
The partial sum of 800 is 319,600; the partial sum of 1200 is 719,400. Adding 50% more particles more than doubled the number of checks. The check itself involves squaring the delta-x and delta-y, summing them, and testing them against the collision limit, which isn’t too bad. But particles that do collide require handling that is more involved.
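That count is just the triangular number n(n−1)/2, easy to verify:

```python
def pair_checks(n):
    """Unique particle pairs to test: 1 + 2 + ... + (n-1) = n(n-1)/2."""
    return n * (n - 1) // 2

print(pair_checks(800))    # 319600
print(pair_checks(1200))   # 719400
```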
Worth the wait. I find the higher density of particles mesmerizing.
§ §
My view does turn on time — including that it flows forward — being a fundamental property of reality. (I likewise think space is a fundamental property.)
When forming an ontology, something has to be fundamental (unless one’s ontology involves “turtles all the way down”). A key distinguishing feature of an ontology is what it considers axiomatic. In my view, the simplest ontology comes from taking time and space as fundamental properties of existence. They are things that just are.
I find support in Kant’s view that our transcendental idealism likewise has time and space as fundamental aspects of our intuition — frames that define our every thought. I would argue that time and space are so fundamental to our minds because they are indeed fundamental properties of reality.
In contrast, the view that entropy (or in some views, change) is fundamental requires that time be secondary and emergent. (I understand theoretical physicist Carlos Rovelli to take this view.) It requires a definition of entropy (or change) that has no notion of time — no equation with a t variable even implied.
(Regarding “implied”, a simple example: Newton’s F=ma, at first glance, has no time variable, but acceleration (a) is the change in velocity over the change in time. For that matter, velocity is the change in distance over the change in time. It turns out that nearly every physics equation ultimately depends on time.)
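Spelled out, the time dependence hiding in Newton’s second law:

```latex
F = ma = m\,\frac{dv}{dt} = m\,\frac{d^2x}{dt^2}
```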
((Along the same lines, the very basic physical experiences of velocity and acceleration, as just mentioned, are emergent properties of motion through space and time. Their very definitions reference the passing of time.))
To me, fundamental entropy (or change) seems a more complicated and unlikely ontology.
I think time must be fundamental to provide a context in which the Big Bang occurred. Further, Minkowski spacetime singles out time — it has the opposite sign — from the three spatial dimensions.
§
But that’s just my opinion. Perhaps time and space emerge from the Big Bang and other dynamic laws of physics, but given the mathematics of the standard model, quantum mechanics, and general relativity, I’m dubious.
Time is sometimes said not to exist in physics (mostly because basic physics works the same regardless of the sign of the time variable — i.e., works the same forwards and backwards), but I think the truth is that it’s so fundamental and ubiquitous that it effectively vanishes from sight.
These videos are meant to illustrate how entropy emerges from the lawful behavior of the system. The program only knows about the physics (such as it is) of the “particles” — there is no entropy to be found in the code. Entropy emerges in consequence of the (virtual) dynamics.
§ §
Here’s a last video for dessert. During the development process I created some that are another form of the kind of random pattern dynamic art I mentioned at the beginning:
It’s an animation of particle movement with no collision detection (but they do detect the walls). Rather than create a new image for each frame, I kept adding on to the same frame, so the particles leave trails. It’s just kind of a wild thing to watch. The trails give it a 3D feel and depth.
For the curious, I wrote the code in Python (wonderful, wonderful Python) and used the Pillow image creation package to generate the images. I used ffmpeg to create the videos.
The particle videos are ten minutes at 25 frames per second, so the code generated 15,000 frames of 1920×1080 24-bit PNGs. Other code generates a few hundred more frames for the fade in and fade out.
Stay entropic, my friends! Go forth and spread beauty and light.
∇
May 27th, 2021 at 7:35 am
Along the way I created various images that [A] tested my code but also [B] looked attractive enough as their own kind of art. They came in three different flavors as the code developed:
First there was “Rays” when I was setting all the particles to the (0,0) starting coordinate:

Then came “Trails” which assigns random starting coordinates:

Finally I did “Threads” which is very similar to “Trails” but is rendered differently:

In “Trails”, each path is laid down in its entirety, then the next, and so on. It’s as if first there was one trail, then another, and so on. The early trails are buried, but you can see all of the last ones.
In “Threads”, the paths evolve together. The early parts of all paths are buried, but you can see the last part of all paths. It creates an effect that looks like woven threads. The “Threads” video comes from this way of rendering.
May 27th, 2021 at 7:36 am
(View the images individually for large size versions.)
May 27th, 2021 at 7:45 am
FWIW, even the idea that entropy is “time’s ratchet” — that it causes time to only move in one direction — doesn’t work. No physical law prevents entropy from going backwards; it’s just an overwhelming statistical probability that it won’t.
(Indeed, under the Many-Worlds Interpretation, there must be an infinite number of worlds where entropy largely turns out to run backwards.)
In my view, time doesn’t need a ratchet. Part of its fundamental nature is that it runs in one direction.
May 28th, 2021 at 11:33 am
Great animations, Wyrd! And lots of interesting threads to pull here. It’s been a little while since I dug into entropy so I’ll probably rediscover thoughts I’ve previously had on this while writing, or after I’ve posted this. But anyway, to me it’s definitely not in any way a force. And I think to assert that it is fundamental is challenging as well. So I think generally I agree with you on those points. Entropy seems, in a sense, to be a characteristic of systems that evolve in time, and like you said, it just seems to “happen” as a result of more fundamental laws doing their thing. That said, it’s very interesting that those fundamental laws doing their thing produces this reliable characteristic, given that those fundamental laws cannot predict this characteristic. (Or can they? I don’t have certainty on that, but was thinking that the notion of ever-increasing entropy couldn’t be derived as being inevitable from more fundamental concepts?)
Okay, so we have these basic laws of physics that mirror perfectly in time–at least the classical ones, and the wave equation too, depending on our thoughts about collapse and decoherence–but we don’t observe things ever going the other way. And entropy is a parameter we’ve discovered that we can calculate, and that we can then observe to be perpetually increasing (provided we draw the appropriate system boundaries in our analysis; we can always have localized reductions in entropy as long as these are offset by more global increases right?)
So we get into this really interesting can of worms right? It’s hard to refute that this seems fundamentally related to time. One could imagine a universe with an absolute time in which time proceeds indefinitely but absolutely nothing happens–like a big clock that never got wound up. But aside from that, it seems that time and change, and change and entropy, are deeply related in our universe. That’s not to say entropy causes time, because I don’t think it causes anything, as you noted. And then entropy is also very much related to life–as living systems function by creating asymmetries in entropy flow.
So the link between time and entropy seems to be irreversibility in physical processes, right? And irreversibility (thermodynamically) relates to the reduction in the quality of energy that exists before and after some process or event occurs. In the classical sense, we may have the same amount of energy, but not in the same forms, and the forms we’re left with are more dilute, diffuse, less potent, etc. The perpetual increase in entropy is really a fancy way of saying things generally wind down, right?
And it turns out that more dilute, diffuse, less potent forms of energy are described by macrostates that simply do have more microstates. I don’t think these definitions are super subjective, however. Meaning, yes, I suppose observers could define them differently, but I don’t think it’s like a quantum thing where different observers draw different conclusions. I think it’s just the fact that the number of possible system configurations that yield identical average properties are more numerous in energetic systems that are more dilute, diffuse and less potent.
I’ve written too much, only because I find it all extremely interesting. I always feel like this suite of ideas is hovering close to a discovery that could possibly change the whole kit and kaboodle in some really interesting way. But I don’t know what that would be. One thought I enjoy pondering is that everything came from nothing: meaning all the forces in the universe sum to zero. All forces could then be described as forms of attraction to an original state. Then, the increase in entropy we see is innate, because it reflects the effort on the part of all forces to dissolve imbalance.
I think this is true in some sense but that it’s also complicated—like the financial system, say, where debts against the original state I’ve posited are transferred, accrued, traded, repackaged, etc. But ultimately, that’s all it is. And because of that, since every element of the whole is seeking a return to its origin—which would be zero, or stillness, say—then it would follow that all interactions we observe proceed in that direction.
What I’m not sure we know yet is whether the imbalance that disturbed the zero state from which all these forces sprang was just a one shot deal—at the Big Bang—or if there might be a fundamental physics we’ve not discovered by which fundamentally new imbalances in the universe come into being. Like if the Big Bang was the mortgage the universe signed, and it took out this huge loan at the beginning, but then various home equity loans might have been taken out along the way as well. There’s just a lot we don’t know on the cosmological scale in my opinion. But do we know enough to rule out with certainty the presence of an organizing (entropy-reducing) principle in the universe that may not be fully understood yet?
Michael
May 28th, 2021 at 1:00 pm
I’ll go back to the top, but your last sentence caught my eye:
“But do we know enough to rule out with certainty the presence of an organizing (entropy-reducing) principle in the universe that may not be fully understood yet?”
In my view, not only don’t we know, but it forms a cornerstone of my theology, the possible hand of God. (I posted at length about this in 2019.)
“…given that those fundamental laws cannot predict [entropy]. (Or can they?…”
That’s a good question. On Mike’s blog you pointed out that reducing a rose to atoms is much easier than deducing the emergence of roses from atoms, although the latter is, in principle, possible. I think the emergence of entropy could be deduced in principle, since it is a consequential behavior.
But it might be really hard to do, at least in some cases. The notion simulated in these animations (glad you enjoyed them!) of a confined “gas” given more room to move seems an obvious one. It certainly didn’t take a ton of code to implement the virtual reality. Other kinds of entropic behavior might not be as obvious?
Hard to say, but my intuition is that entropy is such a fundamental consequence that it might be predictable given a knowledge of system properties. I guess it would depend how well one could model the system. If the model is any good at all, just running it forward (exactly as these animations do) should demonstrate the entropic behavior.
(It might be worth emphasizing that the animations are renderings of a physical model being executed. The system is idealized in multiple ways — it’s 2D, frictionless, collisions are elastic and head-on — but does model a simple set of “physics laws.”)
It occurs to me there might be a difference between trying to derive entropy from just the physical laws themselves versus from models of physical systems that use those laws. It’s the latter where entropy becomes obvious. It’s not particularly reflected in the laws themselves. The code of the animation doesn’t seem to contain it, but running the code sure does.
“…we can always have localized reductions in entropy as long as these are offset by more global increases right?”
Right.
“…it seems that time and change, and change and entropy, are deeply related in our universe.”
Indeed. Entropy is just physics+time. Change, likewise, emerges from physics+time. Entropy is just a species of change. (Change is just a very general notion of different states between two points in time.)
“So the link between time and entropy seems to be irreversibility in physical processes, right?”
To keep this interesting I’ll push back on that a bit. In my view (FWIW), the irreversibility of time is axiomatic (as is time itself). Entropy, as you said above, is just a characteristic of dynamic systems.
Let me put it this way: Time never runs backwards, but entropy can increase or decrease (even without external energy; it’s just hugely unlikely), so it’s not directly linked to time. But it is something that can only happen because time does exist.
“The perpetual increase in entropy is really a fancy way of saying things generally wind down, right?”
Right again.
“I don’t think these definitions are super subjective, however.”
Agreed. One does have to determine a value for Omega, but it’s certainly not entirely arbitrary. And the notion of entropy obtains even if one can’t quantify Omega. The system itself is still statistically moving towards macro states with larger populations of micro states even if the boundaries between those macro states are hard to quantify.
“All forces could then be described as forms of attraction to an original state.”
Question: How does dark energy and that we now believe the universe will expand forever fit into that? In many ways it’s similar to the entropic heat death end of the universe scenario. Would that constitute a return to the zero state?
(As an aside Roger Penrose made a really interesting observation I hadn’t considered. We used to think the Big Crunch was a possible end of the universe scenario. And that maybe that resulted in another Big Bang to restart the cycle. Some wondered if the Big Crunch would result in time running backwards. Part of the argument being the identical natures of expansion and collapse. But Penrose points out the smooth nature of the Big Bang in contrast to the increasing clumpiness of the Big Crunch as various well-evolved objects — especially black holes — merge during the latter stages of collapse. The two would, in fact, look nothing alike. I’d never thought of that, but he’s absolutely right. Moot point now, but interesting.)
May 29th, 2021 at 3:05 pm
Good pushback to keep it interesting. 🙂 In this section you wrote, In my view (FWIW), the irreversibility of time is axiomatic (as is time itself). And also, Time never runs backwards, but entropy can increase or decrease (even without external energy; it’s just hugely unlikely), so it’s not directly linked to time.
I think the question in terms of physics is why processes in nature almost ubiquitously have time directionality. One can imagine, for instance, that time is axiomatic, and it runs forward without ceasing just like you describe, but the processes in nature run the opposite direction from what we presently observe. The basic laws have no preference about this as I understand the state of affairs, right? I believe this is the head-scratcher.
But maybe it’s not a head-scratcher. How, for instance, would one create your animation so that the widely dispersed balls at the end returned to their originally compacted condition? I think that if you knew the positions and velocities of all the particles at the moment the simulation ended, and you reversed the velocity vectors by 180-degrees, it should go back to the low entropy state. Would you agree? That might be a fun test!
Although maybe in your collision algorithms you disrupt the otherwise perfectly continuous laws in such a way that it wouldn’t work. You’d know that right away from how you coded it I suspect. At any rate, what this thought experiment suggests is that the directionality of the physics that take place in an always-progressing-time is dependent on the initial conditions. If the initial conditions of the universe were a bunch of particles on trajectories that, if left to their own devices, would collapse back to the Big Bang, time could still run forward, only the universe would run backwards to how it has to date. We don’t have any reason to believe it wouldn’t, right?
I’m honestly unsure about this. Like… which symbols would we have to flip in our fundamental equations for this to work? If you just inverted the velocities of particles in your simulation, and the physics was just right, then it might get back to the original state. But that’s only for the very idealized state without friction or any other irreversibilities. We’d still have those in any real world case, so it seems like my thought experiment would probably fail in real world cases?
And so perhaps, although I didn’t mean to arrive at this point, this is why it is suggested that entropy is related to the directionality of physical processes in time. Because the irreversibilities are quite real…
Question: How does dark energy and that we now believe the universe will expand forever fit into that? In many ways it’s similar to the entropic heat death end of the universe scenario. Would that constitute a return to the zero state?
Well I did think on that fleetingly while I wrote this, and that is why I suggested maybe the universe keeps taking loans from the Zero State as it goes. That would be the dark energy. The question is whether or not there is anything offsetting, and as I understand cosmology right now there isn’t. But my intuition that everything is somehow related is stronger than my intuition that cosmology won’t go through some major revisions in the future, so my answer is I have no idea. If dark energy is correct it essentially means physics believes in magic I guess, as the universe would be constantly adding energy to its own account. Do you understand that the same way—that dark energy is “appearing” to fill an expanding space-time, in such a way that the absolute sum of all dark energy is an increasing summation with time?
May 29th, 2021 at 5:07 pm
To start with the easy one, yes, my understanding is that dark energy seems to be adding energy to the universe. The amount of dark energy is, as I understand it, fixed per volume of space, so as space expands, there is necessarily more dark energy.
I’ve seen the analogy of stretching a rubber band adding energy to the rubber band, but this seems a case of the rubber band stretching itself, so where’s the energy coming from?
I’ve long wondered if we don’t have some aspect of cosmology wrong. Sometimes I get the fanciful notion that the solar system is all there is, and the rest is a painted backdrop. We have no way to know what’s really out there, and things like dark matter and dark energy seem based on a lot of assumptions that extend physics as we know it to the universe. Wouldn’t surprise me at all if we’d gotten something wrong.
To jump backwards just a little:
“And so perhaps […] this is why it is suggested that entropy is related to the directionality of physical processes in time. Because the irreversibilities are quite real…”
Yes, I think that’s why some think it’s “time’s ratchet” — that it causes time to run in one direction. The implication being, I guess, that if not for entropy, time might just as easily run backwards as not.
Back to the top…
“The basic laws have no preference about this as I understand the state of affairs, right?”
The basic laws work the same, but the emergent processes of those laws tend to have the same space-filling trajectories. Many large-scale phenomena just don’t work in reverse.
Take the canonical broken egg. The usual statement is that just the right forces would reassemble the egg. But how is it possible to fuse shell fragments? The shell is the result of a long process that grew it, and its molecules are bound in that grown matrix. Shoving two pieces together won’t join them seamlessly (and probably not at all). So there’s no combination of “just the right forces” that would reverse an egg breaking. Making is slow, breaking is fast.
“How, for instance, would one create your animation so that the widely dispersed balls at the end returned to their originally compacted condition?”
It would be an interesting experiment and one that might not be hard to pull off.
The model is frictionless, so energy isn’t lost. And it uses elastic head-on collisions, which means velocity vectors just get swapped during collisions. In theory, the same set of velocity vectors randomly assigned at the beginning still exists unchanged (other than having been shuffled among the particles during collisions).
That sounds reversible, but my guess is that it won’t work if the animation runs for any length of time. The reason is that the velocity vectors probably don’t have enough precision to ensure the desired final position after 15,000 time steps.
The vector elements are floating point “doubles” with 53 bits of precision. I’d have to try to figure out the math, but after 15,000 time steps the possible final positions will be a set of points with {math} amount of separation between them. There’s no possible starting configuration that can hit the space between them; the points are the only possible destinations.
Looked at the other direction, each time step in the normal animation loses a tiny bit of precision, so the vector at the end has been effectively rounded off. In a sense, the starting position has been lost.
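A toy demonstration of that precision loss (one particle, one axis; not the simulation code): a forward step followed by a reversed step doesn’t land back on the starting position, because each addition rounds to the nearest 53-bit double.

```python
x, v = 0.1, 0.2   # arbitrary position and velocity

x += v            # one forward time step
v = -v            # reverse the velocity vector (negation is exact)
x += v            # one "backward" time step

print(x == 0.1)   # False: x is now 0.10000000000000003
```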
Likewise, I’m not sure large scenarios, let alone the entire universe, can have initial conditions such that it would run backwards to the beginning. There is the egg-shell issue that making and breaking aren’t symmetrical. There is also that reality may not have enough precision to set starting conditions that finely.
(Which, BTW, is another reason why I don’t believe the universe is fully determined. I don’t think reality has the precision for full determination.)
May 30th, 2021 at 9:39 am
You make an excellent point here I think: Making is slow, breaking is fast.
I don’t really know what I’m trying to say exactly, but this gets me thinking about the notion of path dependency, and how even though the fundamental physical laws are time-symmetrical, as you noted, there is a symmetry breaking (apparently) that occurs in “large-scale phenomena.” Although I wonder if you would accept a revision to the phrase “large-scale phenomena” to be something related to living processes–something like this: “processes extended in time that involve sustained relationship-building between the participatory elements.” Not great, but definitions like this are difficult! I am just thinking it’s not the scale, it’s the nature of certain processes–like the organic generation of the egg shell–that mark them as being unique.
The physical laws permit these processes, but there’s something missing perhaps. I mean… it just seems a pretty big leap to get from the fact that, yes, certain atoms will form a bond due to chemical affinity, to the production of a calcium carbonate shell with a cuticle.
Ilya Prigogine, in his work on thermodynamics and far-from-equilibrium systems, did some interesting mathematical work that is WAY over my head, but he was in essence noting that when we have sustained, “long-term” interactions within a system over time, there is a symmetry breaking that occurs. The system can’t just “go back” very easily. Like it went through a mode shift or something.
This stuff remains fascinating, AND, I find myself agreeing that it’s unclear how entropy could actually cause anything in all of this. It remains a characteristic, and an insightful property to understand, but it’s difficult to see causation…
Michael
May 30th, 2021 at 12:17 pm
As far as living processes, would you consider crystal growth a living process? Biology is one of my weaker sciences, so I could be far off the mark, but I tend to lump egg shells into the class of things that form due to chemical processes more than life processes. (Life obviously is a chemical process.)
That said, the very nature of living things is to grow slowly, so life processes are where the asymmetry really stands out.
The weird thing about the asymmetry is that it’s statistical, not due to some specific physical law. (More like all of them.) As I mentioned above, entropy physically can reverse, but the odds of it are incredibly unlikely. Giant numbers that hurt your head unlikely. It ends up amounting to a law, but actually isn’t one.
I think it’s fascinating reality went from atoms to egg shells, and I quite agree it’s a big leap. It took billions of years to make the trip! But, as we discussed before, one can’t help but wonder what really accounts for how simple rules plus simple building blocks result in such wonders.
“…when we have sustained, ‘long-term’ interactions within a system over time, there is a symmetry breaking that occurs. The system can’t just ‘go back’ very easily.”
That makes sense. Each interaction perturbs the system in some small way, and those add up over time.
Which brings me to the idea you had last time about reversing the flow. My tests all indicate the particles return to the general area of the corner, but not their starting positions. In particular the blue particle never seems to follow its own backtrack. (See video below.)
I suspect the problem is the collision handling code, which does move the particles out of range of each other, and that operation might not be symmetrical. In one test I did with a very long run, the particles remained randomized and didn’t even return to the corner area. (I only did one of those, it was late, and it’s possible something else caused that result.)
But it’s kind of wild seeing them all suddenly start heading for the corner!
May 30th, 2021 at 1:45 pm
As far as living processes, would you consider crystal growth a living process? Biology is one of my weaker sciences, so I could be far off the mark, but I tend to lump egg shells into the class of things that form due to chemical processes more than life processes. (Life obviously is a chemical process.)
I think if you dig into it, Wyrd, you’ll find that the growth of calcium carbonate shells in birds requires a great deal more than the processes of inorganic crystal growth could explain. For instance, (just researched this a little because I was curious, and most of the research is on chickens understandably), hens can mobilize up to 10% of their bone mass to provide calcium for shell-making in a matter of hours, and generally speaking, the rate of calcium intake from food cannot keep pace with the needs of shell production. Chickens (many birds?) have ‘bones inside of bones’ that appear to be calcium reservoirs, in part.
Next, the process by which the shell itself is formed is still not fully understood, and like many living processes, relies heavily on enzymes as catalysts to trigger and manage the necessary reactions. Here is one description from a quick web search on shell production, “Theories to explain the formation of carbonate ions center on the enzyme carbonic anhydrase, which is present in high concentration in the cells lining the shell gland. One theory assumes that two bicarbonate atoms are in equilibrium with a molecule of carbonic acid and a carbonate ion, with the equilibrium strongly in favor of the bicarbonate ions. The hypothesis is that the carbonic acid is continuously being dehydrated to carbon dioxide gas under the influence of the carbonic anhydrase, and that carbonate ions continuously diffuse or are pumped across the cell membranes into the shell gland, where they join calcium ions to form the calcite lattice of the growing crystals in the egg shell. An alternative theory, proposed by Kenneth Simkiss of Queen Mary College in London, is that the carbonate arises directly in the egg shell gland by the hydration of metabolic carbon dioxide under the influence of carbonic anhydrase.”
That is from a Scientific American article from 1970 by T.G. Taylor, but I wasn’t able to quickly find any other description as detailed online. Point being: it’s more of a living process I would say than a simple inorganic chemistry problem.
As to crystal growth in general, it all depends on how we define living I guess. I don’t have a standard answer on that really. I know that some crystals in clays (I think) can have patterns that reproduce and seed new crystals and that this is related to one or another of the early theories of life, but that’s clearly a process we’d term inorganic I think. Or maybe it’s in the twilight between the two.
The weird thing about the asymmetry is that it’s statistical, not due to some specific physical law.
I am not certain that it’s all statistical, at least in Prigogine’s writing, though I am familiar with the notion that the directionality of entropy generation is related to the statistics of how many states there are with lower entropy than those with higher.
If my memory serves, Prigogine sees systems going through dynamic transitions that are something like “gates” in chaotic systems. Once the system goes through that needle eye, it can’t just go backwards. So would we call that statistical? Meaning, I think something different is going on than what we typically mean by statistics, which is that all states being equal, if 99% of them have lower entropy than the current one, we can reasonably expect the system to find its way into a lower entropy condition. The states available to systems with particular histories are limited, not by fundamental law perhaps but by the novel characteristics they’ve produced. So unlike the statistics notion that all states are somehow available, they’re not!
I’m definitely spitballing here… Haha. But I mean, in very abstract terms, if there is an activation energy hurdle that a dynamic system is able to coax a subsystem across, then it may not easily go back. Would we call that statistics?
My tests all indicate the particles return to the general area of the corner, but not their starting positions.
Thanks for doing this! That’s indeed pretty wild. There’s no obvious indication before the last few seconds of the video that it’s all about to end up in the top left corner. It’s almost freakish when we see that start to happen. What’s even more interesting is that despite the inaccuracies in the algorithm you described… it still largely returned to a close approximation of the original state. That to me is unexpected. I would have thought that the chaotic nature of dynamic systems like this would not have led to this result!
May 30th, 2021 at 1:58 pm
PS – I think Prigogine’s Nobel Lecture has quite a few intriguing statements on these topics… I’m going to read it sometime soon. Here’s one, “We see therefore, that the appearance of a periodic reaction is a time-symmetry breaking process…”
May 31st, 2021 at 5:21 pm
I didn’t realize that egg shells were among science’s mysteries, that’s kind of cool. (I’m a bit fascinated by the things we don’t know after so many centuries of science.) And while egg shells are obviously and clearly a life process, your own account does show how chemical that process is. Which may explain the mystery. Chemistry has a lot of unexplored and unknown territory.
The original question was about “making is slow, breaking is fast” and how much connection that has with life processes. Certainly life processes (probably most of them) are like that, but even rock formation processes, let alone crystal growth, are also slow processes. I think the notion of slow formation is broader than life processes (although those are very much in the zone). I think there is a great deal of generality to the idea. (How about stars? They take a very long time to form, for the gas to collapse into a knot that ignites, but supernovas are brief. Or think about how long it takes to build a house of cards!)
“As to crystal growth in general, it all depends on how we define living I guess.”
“Life” is another one of those very difficult-to-define terms. FWIW, I probably would not include crystals, but I’m not deeply committed to the position. I don’t even know what to think about viruses, but above those I think we’re talking life. With crystals, as you said, I think we’d see those as inorganic.
“Meaning, I think something different is going on than what we typically mean by statistics, which is that all states being equal, if 99% of them have lower entropy than the current one, we can reasonably expect the system to find its way into a lower entropy condition.”
I’m not sure we’re on the same page. The idea that 99% of states would have lower entropy: are you intending a counter-example of some kind? Normally the bulk of states available to any system has higher entropy.
Which is all I mean by statistics, just that there are many more high-entropy states available to a system than low-entropy ones (almost by definition). There is also — and this gets at the irreversibility and histories — that the high-entropy states are adjacent to other high-entropy states. The path back to low entropy leads through all the levels from high to low, but each downward step is statistically unlikely. So even if the system does move towards low entropy, the odds are it’ll turn around again almost immediately.
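The counting argument can be made concrete with a toy model. This box-halves setup is my own illustration (not the animation code): treat each of N distinguishable particles as being in either the left or right half of a box, so a macrostate is just the count in the left half, and the number of microstates realizing it is a binomial coefficient.

```python
from math import comb

# Toy model (my own illustration, not the animation code): N particles,
# each in the left or right half of a box. The macrostate is the count k
# in the left half; C(N, k) microstates realize it.
N = 100
corner = comb(N, N)         # all particles in one half: exactly 1 microstate
balanced = comb(N, N // 2)  # evenly spread: roughly 1e29 microstates

print(corner)
print(balanced)
print(balanced / 2**N)      # fraction of ALL arrangements that are balanced
```

With only 100 particles the evenly-spread macrostate already outnumbers the all-in-one-corner state by a factor of about 10²⁹, which is why a system wandering at random essentially never finds its way back down.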
“But I mean, in very abstract terms, if there is an activation energy hurdle that a dynamic system is able to coax a subsystem across, then it may not easily go back. Would we call that statistics?”
I don’t quite follow what you’re getting at. Can you give a for instance?
“What’s even more interesting is that despite the inaccuracies in the algorithm you described… it still largely returned to a close approximation of the original state. That to me is unexpected.”
I was a bit surprised, too, although in theory it’s what we should expect.
It’s worth mentioning that the only way to accomplish this at all is to start with the low-entropy system and run it forward to some point. That’s the only way to get the velocity vectors. Attempting to derive them cold would be impossible.
I should also mention that I reduced the time span because the longer the animation runs forward, the worse it does running backwards. I also reduced the clock speed so particles moved in smaller increments. I hoped that would reduce any asymmetry the collision handling introduced. If I pursue this with the eventual goal of pool balls, I’ll need a much better collision algorithm, and it would be interesting to revisit this with something I was more confident was fully deterministic.
Right now I have no idea, and no way to quantify, how much of this is due to floating point precision versus a putative asymmetry in collision handling.
“I would have thought that the chaotic nature of dynamic systems like this would not have led to this result!”
Remember that chaos is fully deterministic; we just can’t predict what a chaotic system will do from its initial conditions. It’s the same with calculating the velocity vectors for a set of particles to run backwards to a specific state. Because the system is chaotic, there’s no way to determine where a given set of velocity vectors will end up other than by calculating all the steps.
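As a stand-in for the particle model (this is my own illustration, not anything from the animation), the logistic map shows the same combination: re-running from the exact same seed reproduces the trajectory perfectly, yet a perturbation of one part in a trillion is amplified at every step, so there is no shortcut around computing the whole trajectory.

```python
# Stand-in for the particle model (my own illustration): the logistic
# map at r=4 is fully deterministic yet chaotic.
def trajectory(x, r=4.0, n=50):
    xs = [x]
    for _ in range(n):
        x = r * x * (1.0 - x)  # one deterministic time step
        xs.append(x)
    return xs

a = trajectory(0.3)
b = trajectory(0.3)            # identical rerun: determinism, no randomness
c = trajectory(0.3 + 1e-12)    # seed nudged by one part in a trillion
assert a == b
max_gap = max(abs(p - q) for p, q in zip(a, c))
print(max_gap)                 # the tiny nudge is amplified enormously
```

The gap between the original and nudged trajectories roughly doubles each step, so after about 40 steps the two runs have nothing to do with each other, even though each one is perfectly repeatable.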
It’s another version of that point you made about how reducing a rose to atoms is much easier than deducing a rose from atoms. (Possibly for similar reasons. There is probably chaos in how evolution winds its way to roses and anything else.)
May 31st, 2021 at 6:17 pm
Just saw this and had to clarify a blunder on my part. In my account of statistics, I used the word “lower” instead of “higher.” I had this visual in my mind of the “low” entropy state being a small area like the top of the mountain, and all the higher entropy states fanning out below with a much wider area. And so I translated that mental image to the text incorrectly. But yes, I’m in agreement on the notion of statistics: there are MANY more possible states for a system to occupy as entropy increases, and if you look at all the states, the lowest entropy ones would be statistically quite rare as you note. You also note they are path dependent–you can’t get there from here a lot of the time.
Okay, as to an example of activation energy and processes that you can’t really just run in reverse… well, let me go right for something complicated and skip a lot of very important steps to try and make a point. The human body is a structure that one could argue began with a bunch of dirt and gas. Now if we didn’t have microorganisms, when we died our bodies would not decay, so they’d just sit around after they dried out or whatever. The point being: you can’t just run the body backwards to get back to dust and air. It doesn’t work. And that’s because there are actually energy barriers preventing you from doing so. The chemical bonds in tendons that the body has constructed, for instance, don’t just fall apart. They have to be PULLED apart. The process of the body is so damn complicated and has bootstrapped itself through so many gates that, unlike a ball you push to the top of a hill that will simply roll back down, the body just won’t simply reverse direction. You need lots of profoundly clever enzyme-bearing deconstructing organisms to take a body apart.
Or just to think of a body running in reverse, I’m not sure it could function in reverse time, all else being equal. If you reversed the direction of time the enzymes and catalysts we have may not work. Is it true that an enzyme that sponsors a particular reaction in the forward direction works equally well in reverse if time is flipped? I have to say that I don’t think that would always work. I believe we need/have separate enzymes or catalysts for taking things apart as compared to assembling them. So while chaotic systems are deterministic, just unpredictable without walking through all of the steps, the point is that the novelty produced may not permit a simple backtrack…
I’m not 100% sure on this; it’s an interesting topic to ponder!
Michael
June 1st, 2021 at 11:09 am
You and me, both. There is something about entropy that makes “low entropy” and “high entropy” somehow seem backwards to me. Certainly the notion of going downhill to lower entropy makes sense. (I use that mountain metaphor a lot. A low entropy state is very much like balancing on a mountain top, whereas high entropy states are like the flat plains.)
“The chemical bonds in tendons that the body has constructed for instance don’t just fall apart.”
Ah, yes, I see what you’re saying now. Very true! Even water takes energy to return to oxygen and hydrogen.
“If you reversed the direction of time the enzymes and catalysts we have may not work.”
As I mentioned, biology isn’t one of my strong sciences, but I suspect those very energy barriers you were talking about make a lot of chemistry asymmetric in time, so I’m sure you’re right. (Some of the more energetic reactions — explosives — certainly can’t be reversed.)
(As an aside, while chemistry also isn’t a science I’ve pursued much, I’ve read books by chemists working in explosives or rocket fuels, and it’s amazing some of the crazy stuff they work with. Nitrogen compounds, for instance. Nitrogen does not like binding to stuff, so when you can get it to do so, often you end up with something that explodes. Violently. Often if you just look at it wrong. Or even if you don’t.)
((Or fluorine compounds so corrosive they can set concrete on fire. Chemistry is crazy!))
“So while chaotic systems are deterministic, just unpredictable without walking through all of the steps, the point is that the novelty produced may not permit a simple backtrack…”
I think that’s the reality of it. In theory a truly deterministic system, even a chaotic one, can be run backwards at least as a description if not in practice. For instance, if we know exactly what the enzyme reaction is, it’s actually trivial to describe it in reverse. Same as how reduction is easy.
So we can say what happens in reverse, but marshalling the forces to accomplish it may not be possible. That’s the theory, but very much along the lines we’ve been talking about, I have a hard time seeing how, at least in some cases — egg shells, biological reactions, really any slow process — it’s actually possible. And I’ve come to believe it isn’t. You just can’t run time in reverse with large scale systems.
(By which I mean anything above basic physics. It’s basic physics that people mean when they say physics has no preferred direction in time. But I tend to think that from chemistry on up, processes are asymmetric in time.)
May 29th, 2021 at 8:18 pm
Hmmm… early results on reversing the animation aren’t encouraging. The particles return to the general area, but not at all to their starting positions. Video(s) to follow…
May 29th, 2021 at 8:24 pm
Just for fun, here’s the next version. In this one I include a tracking of the blue particle. It’s fun to see how its path changes.
May 30th, 2021 at 11:59 am
Here’s a new video that tries Michael’s suggestion about reversing the velocity vectors after the model has run for a while to see if the particles return to their starting position. If the model is fully deterministic, then it seems they should.
As it turns out, they don’t:
I suspect the reason is that the collision handling routine, which moves the particles just out of collision range with each other, isn’t symmetrical. I tried this several times (this video is just the last one), and often the blue particle goes flying off-track almost immediately, which isn’t accumulated error but something that’s different about reverse than forward.
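Here’s a hypothetical 1-D version of that separation step (my guess at the kind of thing such a routine does, not the actual collision code) that shows why it can’t be symmetrical: two different overlapping configurations get mapped onto the same separated one, so the step destroys information that no reversed run can recover.

```python
# Hypothetical 1-D overlap resolution (a guess at the kind of step a
# collision routine performs, not the actual code): push two overlapping
# particles apart along their line of centers until they just touch.
def separate(x1, x2, radius=1.0):
    gap = abs(x2 - x1)
    if gap < 2 * radius:                  # the particles overlap
        push = (2 * radius - gap) / 2
        x1 -= push                        # jump each just out of range
        x2 += push
    return x1, x2

# Two DIFFERENT overlapping states collapse onto the same output, so the
# step is many-to-one and cannot be undone by reversing velocities:
print(separate(0.0, 1.0))    # (-0.5, 1.5)
print(separate(0.25, 0.75))  # (-0.5, 1.5)
```

The positional jump depends only on where the particles are, not on how they got there, which is exactly the kind of asymmetry that would knock the blue particle off its backtrack immediately rather than through accumulated error.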
Even if I could figure out the asymmetry and correct it, I’m not sure the particles would return exactly to the starting position because I don’t think the velocity vectors have the precision to arrange for exact enough final positions after 15,000 time steps.
Still, it’s bizarre seeing what seems a random flow of particles suddenly start seeming much less random and then all flowing to the corner. At the end it looks like a movie in reverse, but until that point reverse looks the same as forward.
June 5th, 2021 at 3:11 pm
[…] based on a suggestion from Michael, I tried an experiment in reversing time in […]
July 14th, 2021 at 7:43 pm
There are enough video links on this page that I don’t want to add one more, but see this page for a link to a video clip of an interview with theoretical physicist Lee Smolin talking about time being fundamental.
I can only add, “What he said!”
March 29th, 2022 at 8:31 pm
FWIW: I still think time is fundamental, but I’m no longer sure space is. Quantum entanglement challenges the notion that space is fundamental; it may emerge from something deeper. (Some theorists think that either time or space could be fundamental, but not both.)
January 19th, 2023 at 11:22 am
[…] don’t believe entropy causes time. I believe entropy is the result of the laws of physics plus time, and I believe time is […]