PhysicsOverflow is a next-generation academic platform for physicists and astronomers, including a community peer review system and a postgraduate-level discussion forum analogous to MathOverflow.
  Given decoherence, do we still need random quantum jumps and interpretations?

+ 5 like - 0 dislike
4702 views

Environmental decoherence explains how the wavefunction of a quantum system q, as a result of the inevitable interactions and entanglement with the environment, appears to collapse upon measurement (Von Neumann's Process 1).

As far as I understand, decoherence works without needing a "real" collapse. The wavefunction of the total system (q plus environment and measurement equipment) continues to evolve according to the deterministic equations of quantum mechanics (Von Neumann's Process 2) and the total information is conserved, but q alone appears to undergo a quantum jump and lose information. The evolution of q taken alone appears as effectively random, but "q taken alone" is an approximation, and the evolution of the total entangled system q+environment is not random.
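The picture above — global unitary evolution, with q alone appearing to lose information — can be illustrated with a minimal two-qubit toy model. The following is just a sketch in NumPy; the Bell state and the one-qubit "environment" are illustrative choices, not anything specific from the discussion:

```python
import numpy as np

# Toy model: qubit q entangled with a one-qubit "environment",
# |psi> = (|00> + |11>)/sqrt(2).  The total state is pure.
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_total = np.outer(psi, psi.conj())

# Reduced state of q: trace out the environment index.
# rho_q[i, j] = sum_k rho[(i, k), (j, k)]
rho_q = np.einsum('ikjk->ij', rho_total.reshape(2, 2, 2, 2))

print(rho_q)
# The total state is pure (purity 1), but q alone is maximally
# mixed (purity 1/2): the "missing" information about q sits in
# the correlations with the environment.
print(np.trace(rho_total @ rho_total))  # 1.0
print(np.trace(rho_q @ rho_q))          # 0.5
```

Decoherence proper replaces the single environment qubit with very many degrees of freedom, which is what makes the apparent information loss practically irreversible.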

Decoherence explains why the world seems classical instead of quantum (with the exception of very carefully prepared cases that are difficult to realize). Decoherence explains why Schrödinger's cat is always either alive or dead, never in a combination of alive and dead states, without requiring ad hoc additions to the equations of quantum mechanics (such as Von Neumann's Process 1). There is no Process 1, only Process 2.

Questions: Given the success and popularity of decoherence theories, are there reasons to stick to Von Neumann's notion of random collapse? Do we still need an interpretation of quantum physics?

The authors of "Decoherence and the Appearance of a Classical World in Quantum Theory" (which I believe is the "standard" text) do not seem to entirely agree on these points. What do you guys think?

asked May 12, 2016 in Theoretical Physics by Giulio Prisco (190 points) [ no revision ]
Most voted comments

@ArnoldNeumaier ok, but this issue seems to be inherent to any situation where some theoretically calculated predictions are experimentally verified. It is not specific to QM.

@dilaton: Indeed. There is a voluminous literature about the classical measurement problem. On the most fundamental level, see https://www.goodreads.com/series/82607-foundations-of-measurement

More practically oriented, see, e.g., http://physics.nist.gov/cgi-bin/cuu/Info/Uncertainty/index.html

As it can be derived, the collapse need not be postulated. But it is there whenever the conditions for it are satisfied. (Some conditions are needed since the standard collapse postulate only holds under highly idealized conditions.)

No, decoherence unequivocally does NOT explain Von Neumann's Process 1. Rather, it explains when and why non-classical interference effects go away upon measurement or in macroscopic systems. It does not explain state reduction at all. Of course this is why Bohm first worked on decoherence in his pilot wave theory -- his theory explained state reduction, but he still needed to explain loss of interference of the pilot wave itself. Similarly with Everett, and pretty much any other interpretation. Decoherence is an integral part of most modern interpretations, but does not make any contact at all with state reduction.
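The distinction drawn in this comment — decoherence removes interference effects but does not reduce the state — can be made concrete in a one-qubit sketch (NumPy; the amplitudes and basis labels are my illustrative choices):

```python
import numpy as np

# A superposition with equal amplitudes, a = b = 1/sqrt(2).
psi = np.array([1, 1]) / np.sqrt(2)
rho_coherent = np.outer(psi, psi.conj())    # pure state, off-diagonals present
rho_decohered = np.diag(np.abs(psi) ** 2)   # fully decohered: diagonal only

# Interference test: probability of the |+> = (|0> + |1>)/sqrt(2) outcome.
plus = np.array([1, 1]) / np.sqrt(2)
p_coherent = (plus.conj() @ rho_coherent @ plus).real    # 1.0: interference present
p_decohered = (plus.conj() @ rho_decohered @ plus).real  # 0.5: interference gone

# But no outcome has been selected: both pointer states survive
# on the diagonal, each with weight 1/2.
print(p_coherent, p_decohered, np.diag(rho_decohered))
```

The decohered matrix still describes an ensemble of both outcomes; picking one of them is exactly the state reduction that decoherence by itself does not supply.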

''has the result been generalized since 1995?'' Their book contains a density matrix version, which is general enough, it seems to me, to cover many realistic situations. But I haven't yet had the time to understand all the details.

Most recent comments

@Giulio, to your first paragraph, that's why we need decoherence, which, as I already stated, explains the loss of interference effects. To your last paragraph, Everettians don't think we need anything more, which is the whole point of their program. They argue that it is totally silly to posit anything more, since all you need is the universal wave function. Of course, you have to accept the consequences, which are many worlds, but most people who reject these for reasons of parsimony plainly don't seem to understand where they come from.

@Arnold Neumaier, we've been over this stuff before (here and here), and it is as clear now as it was then that you simply don't understand the Everett interpretation. Of course it is fine not to understand something, but you continue to speak as though you do.

3 Answers

+ 4 like - 0 dislike


Very interesting discussion! Let me see whether I can contribute something substantial.

I agree that, accepting the validity of a unitary quantum description for all of physical reality, the decoherence phenomenon is a property of this description, and collapse becomes an emergent phenomenon, occurring in certain open subsystems. What remains instead of coherence, however, is probability, more precisely the probabilities of different classical outcomes, and I think the question which Arnold Neumaier has raised, namely whether probability also is emergent in some sense, is a very interesting one. 

Let me elaborate on this. Decoherence produces (locally) a set of decohered, effectively classical "histories" which, I believe, can roughly be identified with Everett's "many worlds". Quantum theory ascribes each of these a certain probability, instead of the certainty of classical reality which each of us experiences; and in each of the histories collapses will occur, together with an associated history branching. All of the histories are part of the unique "wave function of the universe", but once decohered, they do not dynamically influence each other anymore, and so our consciousness (like any physical process) cannot associate them with each other, i.e. we cannot be aware of more than one of them at a time. So, subjectively we always live in only one of "many worlds", and we experience all events that occur in it with certainty instead of probability (for instance, we observe the decay of a radioactive nucleus as an event which at any moment has either occurred or not occurred, instead of occurred with a certain probability). I believe this is also somehow the idea behind Wigner's "consciousness-induced collapse" and the "many-minds" interpretation.
On the other hand, collapses observed within a given history will then appear stochastic and happen (under suitably controlled conditions) with certain reproducible frequencies, interpreted by us as "quantum jump probabilities". Therefore this latter kind of "probability" is emergent from the underlying, continuous, decohering dynamics of the global "wave function of the universe". 

As you can see from the above, my feeling is that the "many-worlds", "many-minds", and "consciousness-causes-collapse" (and perhaps also "consistent-histories") interpretations in some deeper sense are all equivalent, and just different aspects of the same fundamental interpretation whose basic premise is to ascribe ontic reality to some quantum mechanical state representing all of reality (the "wave function of the universe").
One may consider it an attractive property of this world view that it grants all possible histories of the world equal reality, so in a way our quantum mechanical universe is not just Leibniz's "best of possible worlds" but much more, namely something like the "totality of ALL possible (classical) worlds". Equally well, the same property may be considered repelling, since it displays such an incredible wastefulness, with innumerable "histories" we cannot communicate with, and thus violates Occam's principle to the largest possible degree.

Finally, my simple answers to Giulio's questions. 1. Do we need von Neumann's collapse as a physical process? No, I don't think so. 2. Do we need an interpretation of quantum physics? Sure we do, since the quantum mechanical state reductions leading to our unique classical reality must occur somewhere; even if we push them all the way up into the subjective mind of the observer, we cannot get rid of them altogether. And, even more fundamentally, we must decide whether we consider the quantum state ontic or epistemic, as user1247 has already pointed out repeatedly. 

answered Sep 19, 2016 by Dierk Bormann (70 points) [ no revision ]

Thanks @DierkBormann, good points. I think an interpretation is "nice to have" but perhaps not "need to have." I mean, decoherence studies show that environmental interactions quickly turn a pure state into a mixture: Instead of a quantum superposition, we have different possible outcomes and the theory only predicts the probabilities of the different outcomes.

And what's wrong with that? Randomness is fundamental and the laws of physics only predict probabilities. So what? We can still do physics, and understanding why we don't see the cat in a superposition alive+dead, but either alive or dead, should be good enough.

On the other hand, if the system (the observed subsystem plus the measurement apparatus plus the environment) evolves deterministically according to the equations of quantum mechanics, I am tempted to conclude that the observed outcome must be determined by some aspects of the environment (in principle, though not in practice). But perhaps that's just classical (non-quantum) intuition?


Thanks Giulio for a quick answer. Perhaps we have different ideas of what an "interpretation" is. I believe that we must decide, at a very fundamental level, whether the quantum mechanical description is epistemic (i.e., describes our *knowledge* about reality in some probabilistic way), or ontic (represents *directly* some aspect of reality). Imo this is already an act of interpretation. 

Now, of course there is nothing wrong with an epistemic interpretation, which I suppose you have in mind (and, as a matter of fact, the founding fathers of QM also had). The only trouble with this is that you need to define the notion of an "experiment" with definite classical outcomes, for which QM then can predict probabilities. As far as I can see, this definition lies outside the formalism of QM. 

Another possibility, often favored nowadays, is to ascribe the QM description ontic character, and to apply it to all of reality, in order to avoid the necessity of an artificial boundary between "system" and "measurement apparatus". Assuming that the formalism of QM (including unitary evolution of the state vector etc.) is essentially correct, this interpretation logically leads to the "many-worlds" or equivalent scenarios. Decoherence is a property of the mathematical formalism which leads to the branching into an ever increasing multitude of effectively decoupled "world histories" (or "worlds" for brevity). All these "worlds" are, however, different aspects of a single QM state (traditionally coined the "wave function of the universe"). For all practical purposes, different "world histories" do not interfere with each other anymore at later times once they have decohered, i.e., they continue evolving independently. One can also explain mathematically why they appear *classical* (this phenomenon is sometimes called environmentally induced superselection).

Now the "new" (?) idea which I wanted to throw into this discussion is the following. Since the conscious mind of an observer imo is a physical process like any other, it must evolve independently within any individual "world history" without "knowing" about all the other ones. So, *subjectively* the outcome of any experiment is experienced as definite and not probabilistic (whereas *objectively* all possible outcomes are there), and the observed "state reduction" by measurement (or any other interaction with the environment) is emergent. I believe these or similar ideas must lie behind the popular "many-minds" picture. I also believe that Everett already had something like this in mind; at least, reading his original papers I feel they resonate very strongly with these ideas.

Although I agree that, in the ontic interpretation, the "universe" (subsystem plus measurement apparatus plus environment) evolves deterministically according to QM, I wouldn't say that the "observed outcome" is "determined by the environment". Instead, *all* possible outcomes actually are observed, only in different world histories, i.e. different branches of the "wave function of the universe" (one might say, by different observers residing in "parallel worlds"). This is why I called this scenario "wasteful" and "violating Occam's principle" and why, although it's very elegant, I haven't made up my mind yet whether I really believe in it. 

Any more thoughts about this Giulio? (or anybody else?) I'd love to hear your opinion. Best wishes.  Dierk

Hi @DierkBormann, yes I have lots of thoughts about this but little time to write them down, and my thoughts are not yet clear enough anyway. I hope to post something soon. Two quick replies: Given the ontic/epistemic polarity I tend to consider the wavefunction as ontic, but the difference could be less sharp than we think. I also like the "many-minds" concept. If Everett's relative state formulation is an interpretation of quantum mechanics, "many worlds" is an interpretation of Everett, and I tend to consider "many minds" as a better interpretation, closer to what Everett really said.

Dear Giulio, I believe I understand you very well (both concerning the difficulty of finding time to write, and also the fuzziness/elusiveness of ideas about the fundamentals of QM ;-) ).

But I wonder, how can the contrast between ontic and epistemic interpretation of the wavefunction not be sharp? Wouldn't you agree that the wavefunction must either describe (some aspect of) objective reality or not? If it does, I call it ontic, and I can use QM to describe (some aspect of) ALL of objective reality, i.e. the whole universe, as Wheeler and Everett did. If it doesn't, then it's epistemic, and I need in addition to QM the concept of a "classical" observer (or measurement device) to even make sense of the wave function. The wavefunction then somehow encodes the "knowledge" which the observer (or device) has about reality, which was sort of the default approach of the founding fathers of QM, who didn't expect QM to be universally valid -- in fact didn't even expect its validity to transcend the microscopic domain. 

As to the "many minds" interpretation, my feeling is, as I have said before, that it might become essentially equivalent to "many worlds" and "consistent histories" whenever either of them is logically thought through to the end. However, I'm not sure whether Everett himself would have subscribed to this point of view; at least, I haven't found any indication in his papers that he seriously thought about what the subjective experience should be of a conscious, observing mind that is governed by the laws of quantum mechanics. But then I read his papers a long time ago; perhaps I should re-read them now.

+ 2 like - 0 dislike

Having done more reading and thinking, and in order to restart this interesting discussion, I propose tentative answers to my questions, which were:

"1. Given the success and popularity of decoherence theories, are there reasons to stick to Von Neumann's notion of random collapse? 2. Do we still need an interpretation of quantum physics?"

The answers seem to be: 1. No; 2. Not necessarily.

Decoherence explains the appearance of collapse without Process 1, and the high-level experimental fact - which is at the origin of the quantum measurement debate - that we don't see macroscopic superpositions (besides carefully controlled cases in the laboratory).

If the non-occurrence of macroscopic superposition is considered as the only serious problem (that is, if one is happy to live with non-locality, effective randomness and other "weird" quantum behaviors), then everything seems to work well enough without an "interpretation" (besides, as @ArnoldNeumaier notes, a minimalist consensus interpretation to map the mathematical formalism to reality).

However, decoherence is not incompatible with existing interpretations; on the contrary, it can be incorporated into them, including Everett's MWI and von Neumann-Wigner's consciousness-causes-collapse interpretation.


 

answered Sep 12, 2016 by Giulio Prisco (190 points) [ no revision ]

A "minimalist" interpretation is an interpretation. There is a class of interpretations that might claim to be minimal that have, for example, different interpretations of the mathematics of classical probabilities as frequencies, propensities, et cetera, reflecting, inter alia, varying degrees of realism about (spaces or manifolds of) states, processes, algorithms, information. A claim that some part of a theory is just information (about the world or only about experimental configurations), for example, rather than something more than "just" information, is an aspect of an interpretation.

There is no consensus unless you take 50.0001% to make it so; or perhaps it's enough for 50.0001% of physicists to say that there is a consensus (which might attract arbitrarily little support, provided it has more support than other proposals). My response to the panoply, FWIW, is to take various interpretations to be in some senses equal, in other senses not.

@PeterMorgan - Agree. I don't see how a physical theory can be "something more than 'just' information," but as you say, this is an interpretation.

The collapse is still practically useful, as it summarizes (in the cases where it applies) in a simple way what happens to the open system if the environment is present but ignored.

I would say: "The collapse always happens to a micro-system, as the environment is the source of this micro-system" ;-)

@ArnoldNeumaier - yes, collapse is practically useful, but is it fundamental? According to decoherence results, it doesn't seem so. I believe you think the same, and at this moment I feel almost persuaded. Too bad, because for various philosophical (and probably emotional) reasons I _like_ the idea of fundamental randomness...

Collapse is not fundamental, but emergent. I also think that probability is emergent; my own QFT-based thermal interpretation is not random. But the emergence of probability cannot be due to decoherence. Instead it is due to the impossibility of giving a precise meaning to fundamental (and hence objective) randomness. This is the ultimate reason for the uneasiness that results in the multitude of interpretations.

I wish to reopen this discussion by tweaking (or at least adding qualifiers to) my previous self-answer.

Decoherence: for a system S and a detector D interacting with an environment E (modeled as a random environment), the off-diagonal terms of the reduced density matrix of S+D, expressed in a preferred basis of pointer states and traced over the unknown states of E, rapidly go to zero, and we are left with an ensemble of "classical" states SD(i).
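The "off-diagonals rapidly go to zero" claim can be sketched with a crude toy model (my own illustrative construction, not anything from the discussion above): take a system in (|0>|E0> + |1>|E1>)/sqrt(2) and model the two environment branches |E0>, |E1> as products of random single-qubit states, one per environment particle. The surviving coherence is proportional to the branch overlap <E1|E0>, a product of per-particle factors each of magnitude below 1:

```python
import numpy as np

rng = np.random.default_rng(0)

def coherence(n_env):
    """Magnitude of the off-diagonal term of the reduced density matrix
    for (|0>|E0> + |1>|E1>)/sqrt(2), with |E0>, |E1> modeled as products
    of n_env random single-qubit states (one per environment particle)."""
    overlap = 1.0 + 0.0j
    for _ in range(n_env):
        e0 = rng.normal(size=2) + 1j * rng.normal(size=2)
        e1 = rng.normal(size=2) + 1j * rng.normal(size=2)
        e0 /= np.linalg.norm(e0)
        e1 /= np.linalg.norm(e1)
        overlap *= np.vdot(e0, e1)  # each particle contributes a factor <E0_k|E1_k>
    return 0.5 * abs(overlap)       # off-diagonal element = (1/2) <E1|E0>

# Each factor has magnitude below 1, so the coherence shrinks
# (roughly exponentially) with the number of environment particles.
for n in (0, 1, 5, 20, 50):
    print(n, coherence(n))
```

In a real model the branches E0 and E1 are generated dynamically by the interaction Hamiltonian rather than drawn at random, but this product structure is what makes the suppression so fast.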

So far so good, but we still see only one of the possible states. It seems to me that we should either:

1) Add an independent collapse to reduce the ensemble of possible outcomes to a single outcome.

2) Accept a MWI-like ontology where all possible outcomes are real in separate Everett branches.

3) Conclude that the observed outcome must be determined by some aspects of the environment.

Thoughts?

+ 0 like - 0 dislike

Classical behavior is not just decoherence, but an inclusive (average) picture. You need many, many experiments combined into one average picture, ascribing the mean value to a classical (deterministic) "outcome" ;-)

Classical behavior is the behavior of the mean value. Randomness never goes away in QM (nor in CM, for that matter).

answered Sep 12, 2016 by Vladimir Kalitvianski (102 points) [ no revision ]

user contributions licensed under cc by-sa 3.0 with attribution required
