Over the decades I’ve seen various thinkers assert that entropy causes something — usually it’s said that entropy causes time. Alternatively, that entropy causes time to only run in one direction. I think this is flat-out wrong and puts the trailer before the tractor. (Perhaps due to a jack-knife in logic.)
The problem I have is that I don’t understand how entropy can be viewed as anything but a consequence of the dynamical properties of a system evolving over time according to the laws of physics. Entropy is the result of physical law plus time.
It’s a “law” only in virtue of the laws of physics.
I’ve posted about this before (see: Thinking About Time and Time and Thermodynamics), so I won’t repeat myself (at least not much). This post is more about a little project I had some fun with this past week. As well as being its own goal for this post, the project is a step on the way to a related bigger project I’ve had in mind for some time.
I’m not as visual as some visual artists I know (just like I’m not as musical as some musicians I know), but I am visual (and musical), and I enjoy things like Mandelbrot zooms, visualizations of John Conway’s Life game (and myriad related videos), along with other animations of mathematics.
It’s not just the randomness of the visual, although I do also enjoy clouds, water waves, campfires, and other random visuals. (Ever watched the leaves of a distant Aspen shimmer in the wind? Awesome!) It’s also thinking about the math behind the animation. As I’ve written about before (more than once), the Mandelbrot is especially mind-blowing in that regard.
Getting back to entropy, I’ve always liked those animations of particles that start off with all the particles concentrated in one corner and show how over time they spread out to fill the space uniformly.
I think it’s fun watching the little balls bounce around, and it’s interesting to focus on one to see what happens to it over time. Unfortunately, those animations can be short, small, or of poor resolution, drawing, or execution. Which gets me thinking, “Hmmm, I wonder if I could do something nice?”
Something like this:
And, hmmm, it turns out I can.
Surprisingly, it only took a few days to create. I would have finished sooner, but the first collision detection algorithm, in an attempt at being very fast, turned out to be very problematic (registering a single collision as multiple events due to overlap).
Doing actual distance calculations is slower, but I realized I don’t care about the true distance, only whether that distance is below a certain value. That means I can skip doing the square root in the distance calculation:
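The post doesn’t include the original code, but the idea can be sketched like this (the names and the collision radius here are illustrative, not the actual values from my program):

```python
# Compare squared distance against a squared threshold to skip the
# expensive square root in the distance calculation.

COLLISION_DIST = 8.0                     # assumed particle diameter
COLLISION_DIST_SQ = COLLISION_DIST ** 2  # precomputed squared limit

def colliding(p1, p2):
    """True if two (x, y) points are within collision range."""
    dx = p1[0] - p2[0]
    dy = p1[1] - p2[1]
    return (dx * dx + dy * dy) < COLLISION_DIST_SQ
```

Since the square root is monotonic, comparing squared values gives exactly the same yes/no answer.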
Which does speed up the code. (And doesn’t affect the animation — just the time it takes to create the frames.)
That worked great once I killed an amusing bug in the collision handling routine. In creating the vector between the colliding particles, I used the absolute delta values, so the vector always pointed into the (+,+) quadrant. But it needs to point in any direction, which requires true delta values, not absolute ones.
What was amusing was that the vector is the basis for moving the two particles outside of collision range. Since it incorrectly always pointed in the same direction, it only pointed in the desired direction in 25% of the collisions. In another 25% it pointed in the exact wrong direction, so particles moved towards each other, putting them even deeper into collision range. (The final 50% jumped sideways.)
Fixing the vector was one of those tiny bits of magic — just removing the two abs() functions (I’m not sure what I was thinking when I used them) — that made all the difference. It was a good illustration of the old saw about “for want of a nail…”
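A sketch of the fix (illustrative names; the real collision handler does more than this):

```python
import math

def separation_vector(p1, p2):
    """Unit vector pointing from p2 toward p1, used to push p1 out
    of collision range."""
    # The buggy version used absolute deltas, forcing the vector into
    # the (+,+) quadrant:
    #   dx, dy = abs(p1[0] - p2[0]), abs(p1[1] - p2[1])
    # The fix keeps the signed deltas so the vector can point anywhere:
    dx = p1[0] - p2[0]
    dy = p1[1] - p2[1]
    mag = math.hypot(dx, dy)
    return (dx / mag, dy / mag)
```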
(If you are curious, you can see the bug in action in earlier versions on my YouTube channel. Mainly note how some particles seem to couple together. In the version 1.x videos it’s because collision detection registered multiple events, so particles react, then react again and again, for however many events registered. In the version 2.0 video it’s because of the vector bug. In that one, also note the particles making funny little hops when they collide due to the vector pointing the wrong way. Either way, the result is they stay close and keep colliding.)
This project began last Friday evening as I sat idly pondering a note I’d written: “Entropy Isn’t A Force!” It came from having seen yet another theorist putting the entropy cart in front of the time horse. It’s one of those modern “scientific” views I find hard to fathom. I don’t see how entropy can be framed as anything other than an emergent process.
Of course, it’s always possible I just don’t get it, but until someone presents an explanation that frames entropy as fundamental in a way that’s sensible, I’m dubious entropy is anything but an emergent consequence of physics plus time.
In this version I wanted the particles to be all the same color, except for a handful (one in ten is red). I thought that would make it easier to follow the red ones. And it is, but I meant for there to be fewer red ones. During testing I did runs with fewer particles, so I used 1/10, but with 800 particles 80 was more than I wanted. I just forgot to change the setting before the run (and was too lazy to redo it).
Note that there’s also a “golden snitch” although it’s actually a blue particle. Unfortunately, the blue doesn’t contrast well with the gray. I should have used a different color. (Maybe actually golden!)
One thing about these simulations is that they’re frictionless and all collisions are head-on and elastic. Further, there are no glancing blows that partially transfer momentum. With head-on elastic collisions, one simply swaps the momentum vectors. Glancing blows require more involved calculations.
Think about hitting a motionless eight ball dead on with the cue ball. All the momentum is transferred. The eight ball takes off and the cue ball comes to a stop. They trade momentum vectors.
(See the elastic collision Wiki page for some nice examples as well as a nice particle animation.)
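For equal masses, that swap is trivially simple in code (a minimal sketch):

```python
def headon_elastic(v1, v2):
    """Head-on elastic collision between equal masses: the two
    particles simply trade velocity (momentum) vectors."""
    return v2, v1

# Cue ball hits a motionless eight ball dead on:
cue, eight = headon_elastic((5.0, 0.0), (0.0, 0.0))
# The cue ball stops; the eight ball takes off with the cue's velocity.
```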
A glancing blow results in new momentum vectors for both that combine and then split the original ones. Both the direction and energy change to new values.
As an example, imagine a barely glancing blow that changes the cue ball’s path only slightly while the eight ball slowly rolls away at an extreme angle.
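For equal masses, a glancing blow amounts to swapping only the velocity components along the line of centers while keeping the tangential components (a sketch of the standard calculation, not the pool-ball code I have in mind):

```python
import math

def glancing_elastic(p1, v1, p2, v2):
    """2D elastic collision of equal masses: exchange the velocity
    components along the line of centers; tangential parts unchanged."""
    nx, ny = p2[0] - p1[0], p2[1] - p1[1]
    mag = math.hypot(nx, ny)
    nx, ny = nx / mag, ny / mag       # unit normal along line of centers
    a1 = v1[0] * nx + v1[1] * ny      # normal component of v1
    a2 = v2[0] * nx + v2[1] * ny      # normal component of v2
    d = a2 - a1                       # swap the normal components
    return ((v1[0] + d * nx, v1[1] + d * ny),
            (v2[0] - d * nx, v2[1] - d * ny))
```

The head-on case falls out as a special case: when the velocities lie entirely along the line of centers, this reduces to the simple vector swap above, and momentum and kinetic energy are conserved either way.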
I didn’t want to bother with that for this, but I will for the bigger project I have in mind. I want to create a physically accurate animation of pool balls. Then I want to create side-by-side animations with the same initial starting conditions — save for one tiny variation — and see how chaos causes the trajectories to diverge. (The animation is otherwise fully deterministic, so the exact same starting conditions would always result in the same animation.)
For this project all I needed was particles with a lawful behavior. They didn’t need to reflect real particle dynamics. The point I’m demonstrating is that my simple algorithm results in entropic behavior merely in virtue of applying its rules over time. The entropy it simulates is a consequence.
Recall that entropy is (some constant times) the log of a quantity called Omega (Ω): the number of microstates consistent with a given macro state of the system. Note that this requires some definition of a macro state, so one argument against entropy being fundamental is that it depends on observer definitions. What constitutes a macro state?
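As a toy illustration (my own example, not part of the animation code), take the macro state to be “how many particles are in the left half of the box” and count the microstates that realize it:

```python
import math

def entropy(omega, k=1.0):
    """Boltzmann entropy S = k * ln(Omega). Physically k is
    Boltzmann's constant; this toy uses k = 1."""
    return k * math.log(omega)

n = 800
# All particles crammed in one half: exactly one choice of particles.
omega_corner = math.comb(n, 0)
# An even split: astronomically many ways to choose which 400 are left.
omega_even = math.comb(n, n // 2)

# The uniform macro state has vastly higher entropy than the corner one,
# which is why the spread-out configuration is overwhelmingly likely.
```

The particles never “feel” any of this; the counting is entirely in the eye of whoever defines the macro state.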
So entropy has a virtual aspect as a measurement. While the videos illustrate entropic behavior, playing the video, and the process of watching it, involve essentially the same physical entropy no matter what the video shows. The physical entropy of the system far outweighs any actual influence of the video content.
(By the way, information theory uses entropy a bit differently. I find that use a bit suspect, but that’s a post for another day. The point here is that entropy is a measurement or observation of system behavior, not a driving force.)
This new and improved version has 50% more particles than previous versions!
With 1200 particles, it took a good fraction of a day to generate the 15,000 frames. To check for collisions, each particle must check its distance against every particle after it in the list, so the total number of checks is the partial sum of the particle count (1 + 2 + … + (n−1), which works out to n(n−1)/2).
That sum for 800 particles is 319,600; for 1200 it’s 719,400. Adding 50% more particles more than doubled the number of checks. The check itself involves squaring the delta-x and delta-y, summing them, and testing the result against the collision limit, which isn’t too bad. But particles that do collide require handling that is more involved.
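In code, that pair count is just the triangular number:

```python
def pair_checks(n):
    """Number of unique pairwise distance checks for n particles."""
    return n * (n - 1) // 2

print(pair_checks(800))   # 319600
print(pair_checks(1200))  # 719400
```

The quadratic growth is why the 50% bump in particle count more than doubled the frame-generation time.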
Worth the wait. I find the higher density of particles mesmerizing.
My view does turn on time — including that it flows forward — being a fundamental property of reality. (I likewise think space is a fundamental property.)
When forming an ontology, something has to be fundamental (unless one’s ontology involves “turtles all the way down”). A key distinguishing feature of an ontology is what it considers axiomatic. In my view, the simplest ontology comes from taking time and space as fundamental properties of existence. They are things that just are.
I find support in Kant’s transcendental idealism, which likewise takes time and space as fundamental aspects of our intuition — frames that define our every thought. I would argue that time and space are so fundamental to our minds because they are indeed fundamental properties of reality.
In contrast, the view that entropy (or in some views, change) is fundamental requires that time be secondary and emergent. (I understand theoretical physicist Carlos Rovelli to take this view.) It requires a definition of entropy (or change) that has no notion of time — no equation with a t variable even implied.
(Regarding “implied”, a simple example: Newton’s F=ma, at first glance, has no time variable, but acceleration (a) is the change in velocity over the change in time. For that matter, velocity is the change in distance over the change in time. It turns out that nearly every physics equation ultimately depends on time.)
((Along the same lines, the very basic physical experiences of velocity and acceleration, as just mentioned, are emergent properties of motion through space and time. Their very definitions reference the passing of time.))
To me, fundamental entropy (or change) seems a more complicated and unlikely ontology.
I think time must be fundamental to provide a context in which the Big Bang occurred. Further, Minkowski spacetime singles out time — it has the opposite sign — from the three spatial dimensions.
But that’s just my opinion. Perhaps time and space emerge from the Big Bang and other dynamic laws of physics, but given the mathematics of the standard model, quantum mechanics, and general relativity, I’m dubious.
Time is sometimes said not to exist in physics (mostly because basic physics works the same regardless of the sign of the time variable — i.e., works the same forwards and backwards), but I think the truth is that it’s so fundamental and ubiquitous that it effectively vanishes from sight.
These videos are meant to illustrate how entropy emerges from the lawful behavior of the system. The program only knows about the physics (such as it is) of the “particles” — there is no entropy to be found in the code. Entropy emerges in consequence of the (virtual) dynamics.
Here’s a last video for dessert. During the development process I created some that are another form of the kind of random pattern dynamic art I mentioned at the beginning:
It’s an animation of particle movement with no collision detection (but they do detect the walls). Rather than create a new image for each frame, I kept adding on to the same frame, so the particles leave trails. It’s just kind of a wild thing to watch. The trails give it a 3D feel and depth.
The particle videos are ten minutes at 25 frames per second, so the code generated 15,000 frames of 1920×1080 24-bit PNGs. Other code generates a few hundred more frames for the fade in and fade out.
Stay entropic, my friends! Go forth and spread beauty and light.