Whatever I find interesting

Incoherent Decoherence

An awful lot of confusing stuff is written about quantum measurement, decoherence, Bell’s theorem, etc. etc. I don’t have any answers, but I do at least want to capture the right questions.

What are the issues and non-issues

We all know that quantum systems behave differently from classical ones. We also know that everyday large objects behave classically despite being composed of quanta. The first question is therefore:

  • Why is it reasonable to model a grain of sand as a particle and ignore its wave structure, but not reasonable to do the same for an electron?

And the answer has been known ever since QM was developed: provided the energies we are working with are low enough not to threaten the integrity of the grain of sand, we can treat it as a single wave packet with an absolutely tiny wavelength, and normal unitary evolution won’t spread it out noticeably even on a timescale comparable to the age of the universe. So if everything we are dealing with is similarly macroscopic, our quantum system behaves classically, and nothing gets ‘smeared out’. By comparison, a particle-like wave packet for a free electron spreads beyond recognition in a tiny fraction of a second.
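Those two timescales can be checked on the back of an envelope with the standard free-particle Gaussian spreading formula. The masses and initial widths below are my own order-of-magnitude guesses, not figures from any source:

```python
# Back-of-envelope spreading times for a free Gaussian wave packet.
# The width grows as sigma(t) = sigma0 * sqrt(1 + (hbar*t / (2*m*sigma0**2))**2),
# so tau = 2*m*sigma0**2 / hbar is roughly the time for the packet to double.
HBAR = 1.054e-34  # reduced Planck constant, J*s

def doubling_time(mass_kg, sigma0_m):
    """Time for a free packet of initial width sigma0 to spread noticeably."""
    return 2 * mass_kg * sigma0_m**2 / HBAR

grain = doubling_time(1e-6, 1e-6)          # ~1 mg grain, localised to a micron
electron = doubling_time(9.11e-31, 1e-10)  # electron, localised to an angstrom

print(f"sand grain: {grain:.1e} s (age of universe ~ 4e17 s)")
print(f"electron:   {electron:.1e} s")
```

The grain’s doubling time comes out around 10^16 seconds, so spreading from a micron to anything visibly delocalised takes far longer than the age of the universe; the electron’s packet doubles in about 10^-16 seconds.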

  • What happens if we attempt to create a macroscopic superposition of two different states a la the dreaded cat?

The quick answer is that it will decohere so quickly you’ll never know it existed. We know from quantum computing and other experimental situations that it is difficult to maintain entanglement even between obviously quantum systems like the qubits we build in quantum computers; after all, there’s an environment even in a vacuum (all those quantum fields to excite). An attempt to create a macroscopic superposition by entangling, say, an electron of undefined spin direction with a detector is what we call a measurement, and the result is always that the macroscopic system shows a definite result which depends in some way on the state of the measured electron, but loses the potential uncertainty. This imposition of certainty during decoherence must of course affect the entangled electron, and this is what causes the discontinuous post-measurement change in the electron’s state.
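A toy numpy sketch of that entangling step, with made-up amplitudes and a two-state “detector” standing in for a macroscopic apparatus: once the detector’s pointer states are orthogonal, tracing it out leaves the electron in a diagonal, classical-looking mixture.

```python
import numpy as np

# Electron starts as a|0> + b|1>; the "measurement" correlates each spin
# component with a distinct, orthogonal detector pointer state.
a, b = 0.6, 0.8  # assumed amplitudes, |a|^2 + |b|^2 = 1

e0, e1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])  # detector pointer states
psi = a * np.kron(e0, e0) + b * np.kron(e1, e1)      # entangled electron+detector

rho = np.outer(psi, psi.conj())                 # 4x4 joint density matrix
rho = rho.reshape(2, 2, 2, 2)                   # indices: (spin, det, spin', det')
rho_electron = np.trace(rho, axis1=1, axis2=3)  # partial trace over the detector

print(np.round(rho_electron, 3))
# Off-diagonals vanish; the diagonal holds |a|^2 = 0.36 and |b|^2 = 0.64.
```

The off-diagonal (interference) terms are killed precisely because the two pointer states have zero overlap, which is the sense in which the superposition is no longer observable.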

If the macroscopic system is designed in such a way that the entanglement can be reversed (recombining the beams in a Stern-Gerlach experiment, say), then there is no decoherence, and no effect on the electron.
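The reversible case can be sketched the same way. Treating the entangler as a CNOT gate (my stand-in for the recombined beams, not a claim about real magnets), applying it and then its inverse returns the electron to a pure state, so nothing is lost:

```python
import numpy as np

# Entangle the electron with a probe qubit via a CNOT, then undo it.
a, b = 0.6, 0.8  # assumed amplitudes
psi = np.kron(np.array([a, b]), np.array([1.0, 0.0]))  # electron (x) probe in |0>

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

entangled = CNOT @ psi        # a|00> + b|11>: the electron alone is now mixed
recovered = CNOT @ entangled  # CNOT is its own inverse: back to (a|0>+b|1>)|0>

def purity(state):
    """Tr(rho^2) of the electron's reduced density matrix (1 means pure)."""
    rho = np.outer(state, state.conj()).reshape(2, 2, 2, 2)
    rho_e = np.trace(rho, axis1=1, axis2=3)
    return np.trace(rho_e @ rho_e).real

print(purity(entangled))  # < 1: entangled, would decohere if the probe were macroscopic
print(purity(recovered))  # ~1: entanglement undone, electron untouched
```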

  • Where do the probabilities come from?

When we create a superposition of different macroscopic states entangled with a quantum system we are measuring, the amplitudes of the superposition will correspond to those of the quantum system, and it’s reasonable to expect that decoherence will lead to the various different outcomes in a way which preserves the usual probability calculation, but I haven’t seen this proved.
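That expectation can at least be checked numerically in a toy model: with random amplitudes and arbitrary orthonormal environment states, the decohered diagonal reproduces |a|² and |b|² exactly. This is a consistency check, not a derivation of the Born rule:

```python
import numpy as np

# Entangle random electron amplitudes with two orthonormal (but otherwise
# arbitrary) environment states, trace the environment out, and compare the
# resulting diagonal with the Born probabilities.
rng = np.random.default_rng(0)

amps = rng.normal(size=2) + 1j * rng.normal(size=2)
amps /= np.linalg.norm(amps)                    # random electron state a|0> + b|1>

env, _ = np.linalg.qr(rng.normal(size=(4, 2)))  # two orthonormal environment states
psi = amps[0] * np.kron([1, 0], env[:, 0]) + amps[1] * np.kron([0, 1], env[:, 1])

rho = np.outer(psi, psi.conj()).reshape(2, 4, 2, 4)
rho_e = np.trace(rho, axis1=1, axis2=3)         # trace out the environment

print(np.allclose(np.diag(rho_e), np.abs(amps) ** 2))   # True: Born weights survive
print(np.allclose(rho_e, np.diag(np.diag(rho_e))))      # True: off-diagonals gone
```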

  • So is that it for quantum weirdness?

Well, no. Decoherence explains what measurement is, why it causes an apparent discontinuity in the wave function, and why macroscopic systems look classical. It also implies that the entire universe can be a deterministically evolving quantum system of unimaginable complexity.

But it doesn’t explain how distant, entangled systems preserve the Bell correlations. Somehow, the physical, classically verifiable results of certain experiments, conducted in places too far apart for subluminal signals to pass between them, can display correlations inconsistent with any attempt to explain them on the basis that the actual choice of outcome is predefined within some kind of extra state carried by the measured quantum systems.
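The correlations in question can be computed straight from the singlet state. With the standard CHSH measurement angles (my choice of illustration, not from any particular experiment), the quantum value is 2√2, above the bound of 2 that any predefined-outcome model must obey:

```python
import numpy as np

sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

def spin(theta):
    """Spin observable along an axis at angle theta in the x-z plane."""
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet state (|01> - |10>) / sqrt(2)
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(ta, tb):
    """Correlation <A(ta) (x) B(tb)> on the singlet -- works out to -cos(ta - tb)."""
    op = np.kron(spin(ta), spin(tb))
    return (singlet.conj() @ op @ singlet).real

# Standard CHSH angle choices for the two distant stations:
a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # ~2.828 = 2*sqrt(2), beating the classical CHSH bound of 2
```

Any theory in which each particle carries a predefined answer for every angle is limited to |S| ≤ 2, so the 2√2 here is exactly the gap decoherence alone does not account for.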

So when we entangle the two halves of an entangled pair with two different, widely separated macroscopic environments, both of those macroscopic environments become entangled too. But how does the necessary correlation persist as the inevitable decoherence occurs?
