In the last four posts (*Quantum Measurement*, *Wavefunction Collapse*, *Quantum Decoherence*, and *Measurement Specifics*), I’ve explored the conundrum of measurement in quantum mechanics. As always, you should read those before you read this.

Those posts covered a lot of ground, so here I want to summarize and wrap things up. The bottom line is that we use objects with classical properties to observe objects with quantum properties. Our (classical) detectors are like mousetraps with hair-triggers, using stored energy to amplify a quantum interaction to classical levels.

Also, I never got around to *objective collapse*. Or spin experiments.

The basic motivation for *objective collapse theories* is to preserve a single dynamical law by extending the mathematics of the Schrödinger equation to account for the nonlinear jump of the state vector during measurement.

The *objective* part is that the wavefunction collapses (localizes to position eigenstates) for concrete physical reasons, such as elapsed time or gravity. Large objects collapse, evolve briefly, and re-collapse constantly and rapidly. Quantum systems that interact with large objects collapse quickly under the large object’s influence.

*Spontaneous collapse theories* are a subset with an emphasis on time. Wavefunctions collapse *spontaneously* after some period of time (a little bit like a half-life decay). Collapse is also environmentally dependent, so isolated systems decay slowly. (All these theories, of course, must match what we observe experimentally. They need to account for why tiny, isolated systems can be quantum but why large systems can’t.)
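The “half-life” flavor of these models can be made concrete with a back-of-the-envelope sketch. The GRW model (a well-known spontaneous collapse theory) proposes a tiny per-particle localization rate, commonly quoted as roughly 10⁻¹⁶ per second per nucleon; the function below is purely illustrative, not any model’s actual implementation:

```python
# Sketch of the spontaneous-collapse "half-life" idea: each constituent
# particle has a tiny chance per second of spontaneously localizing, so the
# expected time before the whole system collapses shrinks with system size.
# The rate constant is GRW's commonly quoted proposal; treat it as illustrative.

GRW_RATE_PER_PARTICLE = 1e-16  # collapses per second, per nucleon

def expected_collapse_time(num_particles):
    """Mean time before *any* particle localizes (collapsing the whole system)."""
    return 1.0 / (GRW_RATE_PER_PARTICLE * num_particles)

# A single isolated nucleon stays quantum for ~1e16 seconds (hundreds of
# millions of years), matching the observation that tiny, isolated systems
# behave quantum-mechanically...
print(expected_collapse_time(1))

# ...while a dust grain of ~1e18 nucleons collapses in about a hundredth
# of a second, which is why large systems look classical.
print(expected_collapse_time(1e18))
```

The point of the sketch is the scaling: the collapse rate is additive across particles, so isolation keeps small systems quantum while sheer particle count dooms large ones.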

I like the basic idea of objective collapse theories. I think they help explain how the classical world emerges from the quantum one. I lean more towards ideas about object mass and gravity (plus the environment) than towards ideas involving time. I like the Diósi–Penrose model. Its link with gravity (the “missing link” of quantum) intrigues me. (The notion of a wavefunction “half-life” is attractive, though. Uncertainty adds some time fuzziness in any event.)

These models are not without their problems. They are criticized for not conserving energy (because of noise), but better models may fix that. (And I wonder if a gravity-based model might be less prone to noise.) And there is still the issue of the nonlocality of wavefunction collapse.

Bottom line, though, I think the notion of objective collapse is a step in the right direction. (And I think gravity from the mass of objects is an important factor.) It explains our experience of a classical world — reality observes, or at least collapses, itself.

**§**

I have long wondered why the measurement problem isn’t just a matter of adding *some* kind of math to the Schrödinger equation. That seems obvious somehow. The Schrödinger equation describes the evolution of a quantum state like a parabola describes the arc of a baseball. Both describe an *uninterrupted* process.

The equation for a parabola describes only the flight of a baseball. It doesn’t account for it hitting the ground or a wall (or a bird) — terms have to be added to the parabolic equation to define those interactions. Likewise, it seems the Schrödinger equation needs terms to describe the interaction of a measurement. In fact, applying an operator is the mathematical equivalent of making a measurement, so I’ve never been entirely clear why that doesn’t make measurement a *non*-problem. At least in terms of the non-linear vector jump.
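The operator-as-measurement idea can be sketched with a toy qubit: express the state in the observable’s eigenbasis and square the overlaps (the Born rule). A hypothetical minimal illustration, not any standard library’s API:

```python
# Toy spin-z measurement: the observable's eigenstates are "up" and "down",
# and the Born rule gives the probability of each outcome as the squared
# overlap of the state with that eigenstate. (Real amplitudes for simplicity.)
up, down = [1.0, 0.0], [0.0, 1.0]   # eigenstates of the spin-z observable
state = [0.6, 0.8]                  # a superposition: 0.6|up> + 0.8|down>

def born_probability(eigenstate, psi):
    """Probability of collapsing to `eigenstate`: |<eigenstate|psi>|^2."""
    amplitude = sum(e * p for e, p in zip(eigenstate, psi))  # inner product
    return amplitude ** 2

print(born_probability(up, state))    # ~0.36
print(born_probability(down, state))  # ~0.64
```

Applying the operator tells you the possible outcomes and their probabilities; what the math alone doesn’t supply is the nonlinear jump to one specific outcome.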

Note that, in many kinds of measurement, the quantum system being measured disappears. For example, when an electron absorbs a photon, the photon vanishes. Its wavefunction vector doesn’t just jump — it goes away entirely!

Sometimes I wonder if the measurement problem isn’t something of a misunderstanding. Why would we *expect* the Schrödinger equation to evolve linearly when its system encounters a much larger object, one presumably governed by its own Schrödinger equation? Why don’t we *expect* that sudden interaction to have a sudden effect on the wave states? Definitely a huge one on the quantum system (assuming it’s even still around). Possibly an undetectable one on the large system as far as its *quantum* state goes. As discussed last time, its *energy* state and physical configuration can shift suddenly and massively.

This still leaves the nonlocal spooky action at a distance of the wavefunction vanishing everywhere, the shift from dispersed wave to point-like particle, but entanglement experiments seem to make quantum nonlocality something we may have to just accept.

**§ §**

In the last post, I didn’t talk about spin experiments. Many modern ones involve single particles, so the noise issues I discussed last time apply. A common example is detecting photons that pass — or don’t pass — through a polarizing filter. There is, firstly, the issue of knowing when a photon is present, and secondly, the problem of false negatives and positives due to noise.
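The polarizing-filter case can be sketched numerically: the Born rule gives the per-photon transmission probability (Malus’s law applied photon by photon), and detector noise can be modeled as occasional false clicks. A toy sketch; the dark-count mechanism here is purely illustrative:

```python
import math
import random

def transmission_probability(theta):
    """Born rule for a photon polarized at angle theta (radians) to the
    filter axis: Malus's law, applied per photon."""
    return math.cos(theta) ** 2

def simulate_counts(theta, n_photons, dark_count_rate=0.0, rng=None):
    """Count detector clicks behind the filter; noise can fire the detector
    even when the photon was absorbed (a false positive)."""
    rng = rng or random.Random(0)  # seeded for repeatability
    count = 0
    for _ in range(n_photons):
        if rng.random() < transmission_probability(theta):
            count += 1
        elif rng.random() < dark_count_rate:
            count += 1  # noise click despite no transmitted photon
    return count

# At 45 degrees, each photon has a 50/50 chance of passing the filter.
print(transmission_probability(math.pi / 4))   # ~0.5
print(simulate_counts(math.pi / 4, 10_000))    # roughly 5000 clicks
```

Any single click is ambiguous (real photon or noise?); only the accumulated statistics pin down the cos² behavior, which is the point about noise made above.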

[I believe it’s more accurate to visualize the laser as generating a continuous EM field on the borderline of having one quantum (photon) of light energy per a certain beam length. Variation and uncertainty randomize exactly when there is enough energy to extract one photon. The mystery, as always, is why a particular electron is “chosen” to extract the photon.]

Some experiments use large numbers of photons (or other “particles”) to build statistics that rise above the noise. Geiger counters, for instance, depend on lots of events to signal the presence of significant levels of radiation.

Large numbers may also bring out patterns single particles don’t demonstrate. While individual particles do interfere with themselves in the two-slit experiment, it’s only the pattern that builds up over time that demonstrates the interference behavior. And many entanglement experiments need lots of data points to demonstrate the quantum correlations of entangled particles.

The original Stern-Gerlach spin experiment is worth mentioning. It used silver atoms, which have a net spin because of the unpaired electron in their outer shell. Results depended on lots of silver atoms accumulating on a glass screen — enough atoms to observably block transmission of light (and silver is nicely opaque).

It’s a striking experiment because it creates a very concrete classical record of a quantum behavior. A big part of the amplification here lies in using lots of silver atoms to build up the record over time. Another aspect lies in the energy of the magnetic field; that’s where the actual measurement is made.

There is something of a question of exactly when the silver atoms are localized. Is it upon passing through the magnetic field, or not until they splat against the glass screen? I have seen accounts that view the two possible flight paths as superpositions of up and down measurements until the atoms strike the screen. This is in analogy to beam-splitter experiments where the photon goes one of two ways. (Two-slit experiments also put the two paths in superposition.)

I think the atoms get localized at least in the magnetic field. They’re already bathed in the environment and may have localized even before they interact with the magnet. The interaction with the magnetic field aligns their spins and deflects their path, which certainly changes their wavefunction. There is a question of how rapidly this occurs — is there a quantum jump (a wavefunction collapse) or some sort of progressive interaction? Does the path kink or bend?

The snap of a kink wouldn’t surprise me, but regardless, I think the magnetic field “measures” the silver atoms, which localizes them. The interaction depends on the location of the atom as it passes through the field, which collapses its position. From then on, it’s a “classical” silver atom flying through the air and colliding with the glass screen.

**§ §**

To wrap things up:

There is a wave/particle duality to matter. It only emerges at small (quantum) scales. It’s not apparent at large (classical) scales. Quantum systems do have wave behavior, but it’s not like classical wave behavior. In particular, interference between waves works differently — quantum systems combine *probability amplitudes*, classical systems combine *energies*. In fact, a key difference with quantum systems is that they are probabilistic at their core (via the Born rule). Classical systems are fundamentally deterministic.
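The difference can be made concrete with a toy two-path calculation: adding amplitudes *before* squaring produces phase-dependent fringes, while adding probabilities never does. A sketch, not any particular experiment:

```python
import math
import cmath

# Two paths to the same detector, each contributing amplitude 1/2, with the
# second path picking up a relative phase.
def quantum_probability(phase):
    a1 = 0.5
    a2 = 0.5 * cmath.exp(1j * phase)
    return abs(a1 + a2) ** 2       # Born rule: add amplitudes, THEN square

def classical_combination():
    return 0.5**2 + 0.5**2         # adding probabilities: no phase dependence

print(quantum_probability(0))         # constructive interference: 1.0
print(quantum_probability(math.pi))   # destructive interference: ~0.0
print(classical_combination())        # always 0.5, no fringes
```

Sweeping the phase in the quantum case traces out the interference fringes; the classical-probability sum stays flat at 0.5 no matter what.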

The measurement problem is the tension between the linear deterministic evolution of the quantum state and the nonlinear probabilistic jump of “collapse” (or “reduction”) to a measurement state. The former is a strong article of faith while the latter is so far without a solid explanation. (Or even a nearly universally accepted one.)

There are three distinct problems: Firstly, the nonlinear jump from evolving state to measured state. Secondly, the “spooky” nonlocal vanishing of the wavefunction everywhere instantly. Thirdly, the apparent randomness in determining which “particles” interact. The urgency of the first two depends on your ontological vs epistemic view of the wavefunction. (And interference effects lean hard in the direction of ontological.) The third one applies in all cases. It might be axiomatic that reality is somewhat random.

I don’t *worry* about wavefunction collapse. I think about it, but I don’t fret the disconnect between QM linearity and classical measurement. I think it’s very possible the wavefunction is just our Ptolemaic view of reality — almost right, enough to work very well, but missing a key insight. I don’t think the Schrödinger equation is the last word, just a great approximation, so the difficulties don’t bother me. I think solving the deeper quantum mysteries will allow us to understand measurement.

I think those deeper mysteries involve **quantum superposition**, **quantum interference**, and **quantum entanglement**. Because of its connection with the wave-like side of things, interference strikes me as especially interesting. But understanding the limits of superposition would be very helpful, too. As just mentioned, there are also the mysteries of apparent *randomness* (why *that* electron?) and *nonlocality* (which is an aspect of entanglement). And, of course, what exactly *is* the wavefunction?

Ultimately, I do think there is a Heisenberg Cut where classical behavior emerges and quantum behavior is swamped or averaged out. I think large objects decohere in the noise of 10²³ singers. And perhaps objectively due to mass and the gravity gradient it creates.

The big conflict between QM and GR suggests at least one, if not both, are significantly missing the mark. It’s quantum vs smooth; linear vs nonlinear; background-dependent vs being the background. (That GR is a nonlinear physical theory seems a point in its favor.)

**§**

So, bottom line, what do I think “measurement” and “collapse” are?

I can only guess: “Particles” in flight are single-quantum vibrations in the relevant particle field. Since the single quantum of energy is spread in a wave, the field vibrations in any one area are very much sub-quantum. I assume the wave moves at lightspeed (or whatever speed the particle does). At some spacetime point along the way, nature decrees an interaction. All the energy related to the “particle” instantly “drains” into a point-like interaction with another field, either starting a new wave or affecting an existing one.

However, this just equates *“drains”* with “collapses”, so it still invokes nonlocal magic. And in *“nature decrees”* it invokes the random selection magic. We do seem stuck with a (quantumly) nonlocal and possibly genuinely random universe. I’ve always been fine with that — prefer it, even — so I’d be willing to take them as axiomatic. (Doesn’t mean we stop investigating. They might not be.)

But bottom line, I think measurement is a mousetrap. The detector is a coiled spring with a hair trigger. Along comes a quantum mouse, and… *snap!*

**§ §**

And on that note, it’s time to move on to other things! (Not that there isn’t a great deal more to say, details to fill in, topics to revisit, elaborations to unpack. Like all good rabbit holes, Alice can explore forever. But I’m ready for a change of pace; I’m sure you are, too!)

Stay objective, my friends! Go forth and spread beauty and light.

∇

April 13th, 2022 at 4:26 pm

Have you heard the latest buzz? The experimental mass of the W boson has been measured to be higher than predicted by the standard model. As always, the dust needs to settle. What’s interesting to me is that the weak force seems like one place where the SM cracks show a bit.

April 14th, 2022 at 12:09 pm

Epicycles came from real planets in real orbits but seen from the wrong perspective. Likewise, the wavefunction seems to be, or at least to represent, something real, but we may be seeing it from the wrong angle.

If it is true that quantum mechanics, as it stands today, is our Ptolemaic view of reality at that scale, then obviously we need a quantum Copernicus.

(I’ve wondered if Roger Penrose might be someone along those lines.)

April 18th, 2022 at 12:56 pm

Reasons to Doubt GR:

- Singularities. Does nature actually allow them?
- Information Loss paradox. Is information conserved? (If so, what symmetry does it come from?) Nature of the event horizon is unknown.
- Dark Matter observations. Is it a particle or MOND?
- No quantization of matter or energy.
- Galileo → Newton → Einstein → ???

April 18th, 2022 at 1:07 pm

Reasons to Like GR:

- Nonlinear physical theory that explains our experience of spacetime.
- Extremely well-tested.
- Single unified notion relating mass and the curvature of spacetime.
- Fundamentally intuitive.

April 18th, 2022 at 1:02 pm

Reasons to Doubt QM:

- Assumes fixed spacetime background (same as Newton).
- No gravity. (No curved space.)
- Linear evolution and the “measurement problem”.
- Uses complex numbers and large-D Hilbert spaces. Raises the question of wavefunction ontology.
- Heisenberg Cut: How/where/when does the classical world emerge?
- Non-classical behaviors: interference; superposition; entanglement; nonlocality; randomly selected interactions; Heisenberg Uncertainty.
- QM is a group effort and basically a first cut at solving observational data. A collection of theories with no direct connection to physical reality or experience. No major quantum revolutions.
- No clear physical meaning. No clear interpretation of the math.

April 18th, 2022 at 1:08 pm

Reasons to Like QM:

- Extremely well-tested.

April 20th, 2022 at 1:49 pm

A nice video about the Stern-Gerlach experiment:

June 7th, 2022 at 1:02 pm

[…] Earlier this year I posted a five-part series about the measurement problem in quantum mechanics (see Quantum Measurement, Wavefunction Collapse, Quantum Decoherence, Measurement Specifics, and Objective Collapse). […]

June 15th, 2022 at 9:18 pm

Another possible contributor to the Heisenberg Cut: the wave-nature of matter. In particular, the frequency of those waves [see What’s the Wavelength?].

Physics’s modern spherical cow, the 1 gram paperclip, sitting “motionless” on a desk (we’ll give it a velocity of one micron per second due to vibration), has a wavelength of:

λ = h/mv = (6.626×10⁻³⁴ J·s) / ((10⁻³ kg)(10⁻⁶ m/s)) ≈ 6.6×10⁻²⁵ m

In comparison, the charge radius of a proton is a whopping ≈0.84×10⁻¹⁵ m. The paperclip’s wavelength is about nine orders of magnitude smaller than the radius of a proton. (OTOH, it’s still a vast ten orders of magnitude larger than the Planck Length.)

To the extent that superposition and interference depend on the wave-nature of matter, the wavelengths of macro objects put such effects (at least in paperclips) in a domain much smaller than protons.

So, the wavelength of macro objects contributes to the Heisenberg Cut, is my point.
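The de Broglie relation λ = h/mv makes the paperclip easy to sanity-check (constants rounded):

```python
# De Broglie wavelength of the 1 gram paperclip "moving" at one micron/second.
h = 6.626e-34          # Planck's constant, J*s
m = 1e-3               # 1 gram, in kg
v = 1e-6               # one micron per second, in m/s

wavelength = h / (m * v)
print(wavelength)      # ~6.6e-25 meters

# Compare to the proton's charge radius (~0.84 femtometers) and the
# Planck length (~1.6e-35 meters).
proton_radius = 0.84e-15
planck_length = 1.6e-35
print(proton_radius / wavelength)   # paperclip wavelength ~a billion times smaller
print(wavelength / planck_length)   # but still tens of billions of Planck lengths
```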

June 22nd, 2022 at 4:20 pm

[…] be just the de Broglie wavelength that determines the Heisenberg Cut. As I posted about in Objective Collapse, I like the Diósi–Penrose objective collapse model that depends on gravity from an […]