PhysicsOverflow is a next-generation academic platform for physicists and astronomers, including a community peer review system and a postgraduate-level discussion forum analogous to MathOverflow.


  Given decoherence, do we still need random quantum jumps and interpretations?

+ 5 like - 0 dislike
966 views

Environmental decoherence explains how the wavefunction of a quantum system q, as a result of the inevitable interactions and entanglement with the environment, appears to collapse upon measurement (Von Neumann's Process 1).

As far as I understand, decoherence works without needing a "real" collapse. The wavefunction of the total system (q plus environment and measurement equipment) continues to evolve according to the deterministic equations of quantum mechanics (Von Neumann's Process 2) and the total information is conserved, but q alone appears to undergo a quantum jump and lose information. The evolution of q taken alone appears as effectively random, but "q taken alone" is an approximation, and the evolution of the total entangled system q+environment is not random.
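The picture in this paragraph can be made concrete with a small numerical toy model (my own illustration, not from the thread; the coupling and angles are arbitrary choices): a system qubit q entangles unitarily with N environment qubits, the total state stays pure throughout, yet the reduced density matrix of q alone loses its off-diagonal coherence as the environment "records" q's branch.

```python
import numpy as np

# Toy decoherence model: one system qubit q, initially a|0> + b|1>, interacts
# with N environment qubits (each starting in |0>). If q is |1>, each
# environment qubit is rotated by angle theta, so the environment gradually
# records which branch q is in. Tracing out the environment shows the
# coherence (off-diagonal element) of q's reduced density matrix shrinking,
# while the total entangled state remains pure and evolves deterministically.

a, b = 1 / np.sqrt(2), 1j / np.sqrt(2)   # amplitudes of q (any normalised pair)
theta = 0.5                               # coupling per environment qubit (arbitrary)

def reduced_rho_q(n_env):
    # Environment branch states: |E0> = |0...0>,
    # |E1> = (cos(theta/2)|0> + sin(theta/2)|1>) tensored n_env times.
    e0 = np.zeros(2 ** n_env); e0[0] = 1.0
    single = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    e1 = np.array([1.0])
    for _ in range(n_env):
        e1 = np.kron(e1, single)
    # Total pure state after the unitary interaction: a|0>|E0> + b|1>|E1>.
    psi = np.concatenate([a * e0, b * e1]).reshape(2, 2 ** n_env)
    return psi @ psi.conj().T             # partial trace over the environment

for n in (0, 2, 8, 16):
    print(n, abs(reduced_rho_q(n)[0, 1]))  # coherence ~ |ab| * cos(theta/2)**n
```

The coherence decays by a factor of the branch overlap cos(theta/2) per environment qubit, while the diagonal entries (the would-be measurement probabilities) never change: exactly the "apparent collapse without real collapse" described above.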

Decoherence explains why the world seems classical instead of quantum (except in very carefully prepared cases that are difficult to realize). Decoherence explains why Schrödinger's cat is always either alive or dead, never in a combination of alive and dead states, without requiring ad-hoc additions to the equations of quantum mechanics (such as Von Neumann's Process 1). There is no Process 1, only Process 2.

Questions: Given the success and popularity of decoherence theories, are there reasons to stick to Von Neumann's notion of random collapse? Do we still need an interpretation of quantum physics?

The authors of "Decoherence and the Appearance of a Classical World in Quantum Theory" (which I believe is the "standard" text) do not seem to entirely agree on these points. What do you guys think?

asked May 12, 2016 in Theoretical Physics by Giulio Prisco (185 points) [ no revision ]

I think decoherence is the correct way to explain classical behavior. Quantum information (i.e. amplitudes) is transformed to classical information (i.e. probabilities).  What remains 'mysterious' is that only one of the possible outcomes (of a measurement or other decohering process) actually becomes 'reality'.

Thanks for commenting @NUU - Supporters of Everett's MWI or its Many Minds variant (favored by Zeh I believe) would say that all possible outcomes are actualized. But my point is, if quantum mechanics works (experimental fact) and decoherence explains why the world we perceive is classical (solid theoretical conclusion), do we really need more?

(edited: typo)

I think we don't need more. In fact, we need less: less multiverse, many-worlds, or many-minds; fewer different interpretations. One 'final' interpretation of QM should be enough. The consistent histories approach, see e.g. http://quantum.phys.cmu.edu/CHS/histories.html, comes pretty close to 'final' in my view. The technicalities involved in decoherence and histories make it seem unnecessarily complicated, though.

Thanks @NUU. One more for my study list. What should I read first about the consistent histories approach?

To admins: By mistake I posted an answer instead of a comment. Now the answer is hidden, but I don't see how to delete it.

Just browse the site. There are FAQ, Griffiths' own book, and links to papers. If your time is limited - and whose time isn't ;-) - I'd read his latest paper http://arxiv.org/abs/quant-ph/0612065

Good that the site is back!

@NUU re "What remains 'mysterious' is that only one of the possible outcomes (of a measurement or other decohering process) actually becomes 'reality'." - Wouldn't it be more correct to reword as "What remains 'mysterious' is which one of the possible outcomes (of a measurement or other decohering process) actually becomes 'reality'."?

In other words: Does decoherence (off-diagonal elements of the density matrix vanishing, amplitudes becoming probabilities) imply that only one of the possible outcomes is real, or do we need an extra statement that only one outcome survives environmental interaction?
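A minimal numerical sketch of this distinction (an illustrative dephasing model of my own, not from the discussion): averaging a qubit's projector over random environment-induced phases kills the off-diagonal elements, but the diagonal retains the Born probabilities of both outcomes, so nothing in the decohered density matrix by itself selects which outcome occurs.

```python
import numpy as np

# Illustrative dephasing model. A qubit a|0> + b e^{i phi}|1> picks up a
# random phase phi from the environment; averaging the projector over phi is
# what tracing out a phase-randomising environment does to the density matrix.
rng = np.random.default_rng(0)
a, b = np.sqrt(0.3), np.sqrt(0.7)

phases = rng.uniform(0.0, 2.0 * np.pi, size=20000)
rho = np.zeros((2, 2), dtype=complex)
for phi in phases:
    psi = np.array([a, b * np.exp(1j * phi)])
    rho += np.outer(psi, psi.conj())
rho /= len(phases)

# Diagonal: the Born probabilities 0.3 and 0.7 of BOTH outcomes survive.
# Off-diagonal: averages to ~0 (the coherence is gone).
# Nothing here singles out which outcome is realised.
print(np.round(rho, 3))
```

Decoherence thus converts amplitudes into classical probabilities, but the "extra statement" that exactly one outcome survives is indeed extra: it is not contained in the averaged density matrix.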

If you reformulate your comments as an answer I can vote it up, there are no karma points for comments!

An interpretation is always needed to make contact between a mathematical formalism and reality. Different people differ on the meaning of the latter, hence there are different interpretations. Decoherence doesn't change anything in this respect.

The collapse is needed in practice as an excellent approximation for the behavior of certain open systems. Nobody wants to work all the time with the state of the whole universe - and all smaller systems are open!

Somewhat complementary to what Arnold said, there is a non-negligible number of physicists who are satisfied with the "shut up and calculate" interpretation. For them, the contact to reality consists of the theory making correct predictions for experiments, which QM obviously does very successfully.

@dilaton: Even the shut-up-and-calculate people have to interpret the experiment in terms of the calculations and/or conversely. But they adapt it to the case at hand for which their calculations are applied, without attempting to create a whole philosophy around it.

@ArnoldNeumaier ok, but this issue seems to be inherent to any situation where some theoretically calculated predictions are experimentally verified. It is not specific to QM.

@dilaton: Indeed. There is a voluminous literature about the classical measurement problem. On the most fundamental level, see https://www.goodreads.com/series/82607-foundations-of-measurement

More practically oriented, see, e.g., http://physics.nist.gov/cgi-bin/cuu/Info/Uncertainty/index.html

I think the shut-up-and-calculate crowd have very good points, but they miss all the fun. On the one hand, physics is not about ultimate reality (what is that?) but about building models that work, so once you have a model that works, you should be happy, shut up, and calculate. But on the other hand, the best moments in physics are when, after a lot of studying and thinking, you feel closer to The Thing Itself.

@Arnold re "The collapse is needed in practice as an excellent approximation..." Sure thing. But does the inevitable environmental decoherence eliminate the need to postulate a real collapse, or not?

As it can be derived, the collapse need not be postulated. But it is there whenever the conditions for it are satisfied. (Some conditions are needed since the standard collapse postulate only holds under highly idealized conditions.)

Thanks Arnold, I will read the paper, but how can it be that a 1995 paper (21 years ago) that _derives collapse from the unitary dynamics of a bigger system_ (a totally spectacular result at first glance) isn't better known? I guess that, as you say, the conditions are too restrictive, but has the result been generalized since 1995?

The authors don't claim it solves the problem - they were not addressing the measurement problem at all. And their solution is hidden among long computations in the different (though related) context of nonequilibrium reduced models. That's why it has remained obscure. It is my observation, not theirs, that they solved the problem. I was very impressed by their paper. (Note that work on the foundations of quantum mechanics is little esteemed in the literature, and many never spend a single thought on it after having gone through one of the possible initiations.)

No, decoherence unequivocally does NOT explain Von Neumann's Process 1. Rather, it explains when and why non-classical interference effects go away upon measurement or in macroscopic systems. It does not explain state reduction at all. Of course this is why Bohm first worked on decoherence in his pilot wave theory -- his theory explained state reduction, but he still needed to explain loss of interference of the pilot wave itself. Similarly with Everett, and pretty much any other interpretation. Decoherence is an integral part of most modern interpretations, but does not make any contact at all with state reduction.

''has the result been generalized since 1995?'' Their book contains a density matrix version, which is general enough, it seems to me, to cover many realistic situations. But I haven't yet had the time to understand all the details.

Thanks @user1247. Re "Decoherence is an integral part of most modern interpretations, but does not make any contact at all with state reduction..." - But it seems to me that it does, and quite a hard contact: decoherence lets one argue that there is no state reduction and no Process 1, only Process 2. From the OP: "The evolution of q taken alone appears as effectively random, but "q taken alone" is an approximation, and the evolution of the total entangled system q+environment is not random." At least that's how I understand it.

Thanks Arnold. "Their book" is "The Theory of Open Quantum Systems," right?

Yes, this is their book. A piecewise deterministic process is a combination of unitary evolution and random quantum jumps, so this must be derived from the unitary dynamics of a bigger system. Section 6.6 is the generalization I talked about. It should be read in conjunction with Section 6.3, which contains the case of pure states (essentially as in the 1995 paper).
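A piecewise deterministic process of this kind can be illustrated with a standard Monte Carlo wave-function toy model (my own minimal sketch, not the book's Section 6.6 calculation; rates and step sizes are arbitrary): smooth non-unitary evolution punctuated by random quantum jumps, whose trajectory average reproduces the deterministic master-equation decay.

```python
import numpy as np

# Monte Carlo wave-function sketch of a piecewise deterministic process:
# a two-level atom decaying at rate gamma evolves smoothly under a
# non-Hermitian effective Hamiltonian, interrupted by random quantum jumps
# to the ground state (photon emission).
rng = np.random.default_rng(1)
gamma, dt, steps, ntraj = 1.0, 0.01, 300, 2000

excited = np.zeros(steps)                     # trajectory-averaged population
for _ in range(ntraj):
    c_g, c_e = 0.0, 1.0                       # start in the excited state
    for t in range(steps):
        excited[t] += abs(c_e) ** 2
        p_jump = gamma * abs(c_e) ** 2 * dt   # jump probability this step
        if rng.random() < p_jump:
            c_g, c_e = 1.0, 0.0               # quantum jump: emit and reset
        else:
            c_e *= np.exp(-gamma * dt / 2)    # smooth non-unitary decay
            norm = np.sqrt(abs(c_g) ** 2 + abs(c_e) ** 2)
            c_g, c_e = c_g / norm, c_e / norm # renormalise the state
excited /= ntraj

# The average over jump trajectories approaches the deterministic
# master-equation result exp(-gamma * t).
t = np.arange(steps) * dt
print(np.max(np.abs(excited - np.exp(-gamma * t))))
```

Each single trajectory looks like "unitary evolution plus random jumps"; only the ensemble average recovers the smooth deterministic dynamics, which is the sense in which the jumps can be derived from (rather than added to) the dynamics of a bigger system.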

@Giulio, you are describing Everett's universal wave function interpretation, not decoherence per se. Decoherence does not weigh in on whether the wave function is ontic or epistemic; all it says is that it decoheres. Yes, if you accept that a universal wave function exists and never undergoes state reduction, then this is all you need, as long as you are willing to accept the consequences (i.e. Many Worlds). This is why we consider Many Worlds an interpretation and decoherence not -- the former weighs in on whether the wave function is ontic or epistemic.

@user1247 - if you accept that a universal wave function exists and never undergoes state reduction, you need to explain how it comes that we don't see the cat in a superposition of dead and alive states. Everett doesn't really _explain_ that - he just posits that the observer splits into sub-observers that experience different sub-realities, leaving open the questions of what happens, where, and when. In passing, I have the impression that what Everett really said is also an open question, and perhaps what he had in mind is many-minds rather than many-worlds.

From the preface of "Decoherence and the Appearance of a Classical World in Quantum Theory":

"Evidently, collapse models are meant to describe real physical states in terms of wave functions. They attempt to explain definite measurement outcomes by means of speculative stochastic dynamical terms added to the Schrödinger equation. Environmental decoherence is instead based on entanglement as a crucial element of reality, as a consequence of a universal unitary dynamics. Subsystems then no longer possess their own states, but may be effectively described by "apparent" ensembles of wave functions. Since the latter are essentially the same as those expected to result from collapse models, they effectively describe the measurement process dynamically in quantum mechanical terms. However, if one is not willing to accept an Everett-type interpretation, an essential element of description, such as an objective collapse, is still missing - although it now seems clear that such a collapse is required neither at the microscopic level nor in a measurement device."

I take the last point - "it now seems clear that such a collapse is required neither at the microscopic level nor in a measurement device," - as the key point. If unitary quantum mechanics plus decoherence (which is not an independent assumption or interpretation, but is derived from quantum mechanics itself) alone explain observed reality, why do we need something more, and what more do we need?

@user1247: The existence of a state for the universe does not have the many worlds interpretation as a consequence - by the same token, the existence of a wave function for the hydrogen atom would imply the existence of many hydrogen atoms, which is obvious nonsense. How could a particular interpretation ever be a _consequence_ of the fact that all of the universe can be described in terms of quantum mechanics?

Physics is not described with mathematics. It is we who "describe" some phenomena with mathematics, and this is only possible because we do not care much about precision. Thus "slightly different" things are treated by us as equal, and then mathematics comes into play - we can count.

Any phenomenon in physics must in principle be repetitive - one cannot derive a "law" from one point or from one bit of information. So we always study some limited, quasi-repetitive subsystems in physics. Thus, the whole Universe is not a subject of physical studies, but a subject of endless speculations. Physics, as a "quantitative" science, has rather little choice of subjects with respect to all subjects in nature. As soon as "uncertainty" becomes measurable and important, we get lost in descriptions and in "interpretations", because we want certainty!

@Giulio, to your first paragraph: that's why we need decoherence, which, as I already stated, explains the loss of interference effects. To your last paragraph: Everettians don't think we need anything more, which is the whole point of their program. They argue that it is totally silly to posit anything more, since all you need is the universal wave function. Of course, you have to accept the consequences, which are many worlds, but most people who reject these for reasons of parsimony plainly don't seem to understand where they come from.

@Arnold Neumaier, we've been over this stuff before (here and here), and it is clear now as it was then that you simply don't understand the Everett interpretation. Of course it is fine not to understand something, but you continue to speak as though you do. 

3 Answers

+ 4 like - 0 dislike


Very interesting discussion! Let me see whether I can contribute something substantial.

I agree that, accepting the validity of a unitary quantum description for all of physical reality, the decoherence phenomenon is a property of this description, and collapse becomes an emergent phenomenon, occurring in certain open subsystems. What remains instead of coherence, however, is probability, more precisely the probabilities of different classical outcomes, and I think the question which Arnold Neumaier has raised, namely whether probability also is emergent in some sense, is a very interesting one. 

Let me elaborate on this. Decoherence produces (locally) a set of decohered, effectively classical "histories" which, I believe, roughly can be identified with Everett's "many worlds". Quantum theory ascribes each of these a certain probability, instead of the certainty of classical reality which each of us experiences; and in each of the histories collapses will occur, together with an associated history branching. All of the histories are part of the unique "wave function of the universe", but once decohered, they do not dynamically influence each other anymore, and so our consciousness (like any physical process) cannot associate them with each other, i.e. we cannot be aware of more than one of them at a time. So, subjectively we always live in only one of "many worlds", and we experience all events that occur in it with certainty instead of probability (for instance, we observe the decay of a radioactive nucleus as an event which at any moment has either occurred or not occurred, instead of occurred with a certain probability). I believe this is also somehow the idea behind Wigner's "consciousness-induced collapse" and the "many-minds interpretation".
On the other hand, collapses observed within a given history will then appear stochastic and happen (under suitably controlled conditions) with certain reproducible frequencies, interpreted by us as "quantum jump probabilities". Therefore this latter kind of "probability" is emergent from the underlying, continuous, decohering dynamics of the global "wave function of the universe". 

As you can see from the above, my feeling is that the "many-worlds", "many-minds", and "consciousness-causes-collapse" (and perhaps also "consistent-histories") interpretations in some deeper sense all are equivalent and just different aspects of the same fundamental interpretation whose basic premise is to ascribe ontic reality to some quantum mechanical state representing all of reality (the "wave function of the universe").
One may consider it an attractive property of this world view that it grants all possible histories of the world equal reality, so in a way our quantum mechanical universe is not just Leibniz's "best of possible worlds" but much more, namely something like the "totality of ALL possible (classical) worlds". Equally well, the same property may be considered repellent, since it displays such an incredible wastefulness, with innumerable "histories" we cannot communicate with, and thus violates Occam's principle to the largest possible degree.

Finally, my simple answers to Giulio's questions. 1. Do we need von Neumann's collapse as a physical process? No, I don't think so. 2. Do we need an interpretation of quantum physics? Sure we do, since the quantum mechanical state reductions leading to our unique classical reality must occur somewhere; even if we push them all the way up into the subjective mind of the observer, we cannot get rid of them altogether. And, even more fundamentally, we must decide whether we consider the quantum state ontic or epistemic, as user1247 has already pointed out repeatedly. 

answered Sep 19, 2016 by Dierk Bormann (70 points) [ no revision ]

Thanks @DierkBormann, good points. I think an interpretation is "nice to have" but perhaps not "need to have." I mean, decoherence studies show that environmental interactions quickly turn a pure state into a mixture: Instead of a quantum superposition, we have different possible outcomes and the theory only predicts the probabilities of the different outcomes.

And what's wrong with that? Randomness is fundamental and the laws of physics only predict probabilities. So what? We can still do physics, and understanding why we see the cat not in a superposition of alive and dead, but either alive or dead, should be good enough.

On the other hand, if the system (the observed subsystem plus the measurement apparatus plus the environment) evolves deterministically according to the equations of quantum mechanics, I am tempted to conclude that the observed outcome must be determined by some aspects of the environment (in principle, though not in practice). But perhaps that's just classical (non-quantum) intuition?


Thanks Giulio for a quick answer. Perhaps we have different ideas of what an "interpretation" is. I believe that we must decide, at a very fundamental level, whether the quantum mechanical description is epistemic (i.e., describes our *knowledge* about reality in some probabilistic way), or ontic (represents *directly* some aspect of reality). Imo this is already an act of interpretation. 

Now, of course there is nothing wrong with an epistemic interpretation, which I suppose you have in mind (and, as a matter of fact, the founding fathers of QM also had). The only trouble with this is that you need to define the notion of an "experiment" with definite classical outcomes, for which QM then can predict probabilities. As far as I can see, this definition lies outside the formalism of QM. 

Another possibility, often favored nowadays, is to ascribe the QM description ontic character, and apply it to all of reality, in order to avoid the necessity of an artificial boundary between "system" and "measurement apparatus". Assuming that the formalism of QM (including unitary evolution of the state vector etc.) is essentially correct, this interpretation logically leads to the "many-worlds" or equivalent scenarios. Decoherence is a property of the mathematical formalism which leads to the branching into an ever increasing multitude of effectively decoupled "world histories" (or "worlds" for brevity). All these "worlds" are, however, different aspects of a single QM state (traditionally coined the "wave function of the universe"). For all practical purposes different "world histories" do not interfere with each other anymore at later times once they have decohered, i.e., they continue evolving independently. One can also explain mathematically why they appear *classical* (this phenomenon is sometimes called environmentally induced superselection).

Now the "new" (?) idea which I wanted to throw into this discussion is the following. Since the conscious mind of an observer imo is a physical process like any other, it must evolve independently within any individual "world history" without "knowing" about all the other ones. So, *subjectively* the outcome of any experiment is experienced as definite and not probabilistic (whereas *objectively* all possible outcomes are there), and the observed "state reduction" by measurement (or any other interaction with the environment) is emergent. I believe these or similar ideas must lie behind the popular "many-minds" picture. I also believe that Everett already had something like this in mind; at least, reading his original papers, I feel they resonate very strongly with these ideas.

Although I agree that, in the ontic interpretation, the "universe" (subsystem plus measurement apparatus plus environment) evolves deterministically according to QM, I wouldn't say that the "observed outcome" is "determined by the environment". Instead, *all* possible outcomes actually are observed, only in different world histories, i.e. different branches of the "wave function of the universe" (one might say, by different observers residing in "parallel worlds"). This is why I called this scenario "wasteful" and "violating Occam's principle" and why, although it's very elegant, I haven't made up my mind yet whether I really believe in it. 

Any more thoughts about this Giulio? (or anybody else?) I'd love to hear your opinion. Best wishes.  Dierk

Hi @DierkBormann, yes I have lots of thoughts about this but little time to write them down, and my thoughts are not yet clear enough anyway. I hope to post something soon. Two quick replies: Given the ontic/epistemic polarity I tend to consider the wavefunction as ontic, but the difference could be less sharp than we think. I also like the "many-minds" concept. If Everett's relative state formulation is an interpretation of quantum mechanics, "many worlds" is an interpretation of Everett, and I tend to consider "many minds" as a better interpretation, closer to what Everett really said.

Dear Giulio, I believe I understand you very well (both concerning the difficulty of finding time to write, and also the fuzziness/elusiveness of ideas about the fundamentals of QM ;-) ).

But I wonder, how can the contrast between ontic and epistemic interpretation of the wavefunction not be sharp? Wouldn't you agree that the wavefunction must either describe (some aspect of) objective reality or not? If it does, I call it ontic, and I can use QM to describe (some aspect of) ALL of objective reality, i.e. the whole universe, as Wheeler and Everett did. If it doesn't, then it's epistemic, and I need in addition to QM the concept of a "classical" observer (or measurement device) to even make sense of the wave function. The wavefunction then somehow encodes the "knowledge" which the observer (or device) has about reality, which was sort of the default approach of the founding fathers of QM, who didn't expect QM to be universally valid -- in fact didn't even expect its validity to transcend the microscopic domain. 

As to the "many minds" interpretation, my feeling is, as I have said before, that it might become essentially equivalent to "many worlds" and "consistent histories", whenever either of them is logically thought through to the end. However, I'm not sure whether Everett himself would have subscribed to this point of view; at least I haven't found any indication in his papers that he seriously thought about what the subjective experience should be of a conscious, observing mind that is governed by the laws of quantum mechanics. But then I read his papers a long time ago; perhaps I should re-read them now.

+ 2 like - 0 dislike

Having done more reading and thinking, and in order to restart this interesting discussion, I propose tentative answers to my questions, which were:

"1. Given the success and popularity of decoherence theories, are there reasons to stick to Von Neumann's notion of random collapse? 2. Do we still need an interpretation of quantum physics?"

The answers seem to be: 1. No; 2. Not necessarily.

Decoherence explains the appearance of collapse without Process 1, and the high-level experimental fact - which is at the origin of the quantum measurement debate - that we don't see macroscopic superpositions (except in carefully controlled cases in the laboratory).

If the non-occurrence of macroscopic superpositions is considered the only serious problem (that is, if one is happy to live with non-locality, effective randomness, and other "weird" quantum behaviors), then everything seems to work well enough without an "interpretation" (beyond, as @ArnoldNeumaier notes, a minimal consensus interpretation to map the mathematical formalism to reality).

However, decoherence is not incompatible with existing interpretations; it can be incorporated into them, including Everett's MWI and the von Neumann-Wigner consciousness-causes-collapse interpretation.


 

answered Sep 12, 2016 by Giulio Prisco (185 points) [ no revision ]

A "minimalist" interpretation is an interpretation. There is a class of interpretations that might claim to be minimal that have, for example, different interpretations of the mathematics of classical probabilities as frequencies, propensities, et cetera, reflecting, inter alia, varying degrees of realism about (spaces or manifolds of) states, processes, algorithms, information. A claim that some part of a theory is just information (about the world or only about experimental configurations), for example, rather than something more than "just" information, is an aspect of an interpretation.

There is no consensus unless you take 50.0001% to make it so; or perhaps it's enough for 50.0001% of physicists to say that there is a consensus (which might attract arbitrarily little support, provided it has more support than other proposals). My response to the panoply, FWIW, is to take various interpretations to be in some senses equal, in other senses not.

@PeterMorgan - Agree. I don't see how a physical theory can be "something more than 'just' information," but as you say, this is an interpretation.

The collapse is still practically useful, as it summarizes (in the cases where it applies) in a simple way what happens to the open system if the environment is present but ignored.

I would say: "The collapse always happens to a micro-system, as the environment is the source of this micro-system" ;-)

@ArnoldNeumaier - yes, collapse is practically useful, but is it fundamental? According to decoherence results, it doesn't seem so. I believe you think the same, and at this moment I feel almost persuaded. Too bad, because for various philosophical (and probably emotional) reasons I _like_ the idea of fundamental randomness...

Collapse is not fundamental, but emergent. I also think that probability is emergent; my own QFT-based thermal interpretation is not random. But the emergence of probability cannot be due to decoherence. Instead it is due to the impossibility of giving a precise meaning to fundamental (and hence objective) randomness. This is the ultimate reason for the uneasiness that results in the multitude of interpretations.

I wish to reopen this discussion by tweaking (or at least adding qualifiers to) my previous self-answer.

Decoherence: for a system S and a detector D interacting with an environment E (modeled as a random environment), the off-diagonal terms of the reduced density matrix of S+D, expressed in a preferred basis of pointer states and traced over the unknown states of E, rapidly go to zero, and we are left with an ensemble of "classical" states SD(i).
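The recipe above can be sketched in a few lines (an illustrative partial-trace computation; the states and dimensions are my own choices, with the environment records taken exactly orthogonal for simplicity): tracing a pure (S+D)+E state over E yields a density matrix for S+D that is diagonal in the pointer basis.

```python
import numpy as np

# Partial trace over the environment for a pure state of (S+D) tensor E.
# Reshaping the state vector and contracting E's index gives
# rho_SD = Tr_E |psi><psi|.
def reduced_density_matrix(psi, dim_sd, dim_env):
    psi = np.asarray(psi).reshape(dim_sd, dim_env)
    return psi @ psi.conj().T

# Two orthogonal environment "record" states, <e0|e1> = 0 ...
e0 = np.array([1.0, 0.0])
e1 = np.array([0.0, 1.0])
# ... correlated with two pointer states |00> and |11> of system+detector.
sd0 = np.zeros(4); sd0[0] = 1.0
sd1 = np.zeros(4); sd1[3] = 1.0

# Post-measurement entangled state: (|00>|e0> + |11>|e1>) / sqrt(2).
psi = (np.kron(sd0, e0) + np.kron(sd1, e1)) / np.sqrt(2)
rho_sd = reduced_density_matrix(psi, 4, 2)

# rho_sd is diagonal in the pointer basis: an apparent classical ensemble of
# the two outcomes SD(i), probability 1/2 each -- yet both are still present.
print(np.round(rho_sd.real, 3))
```

The result is exactly the ensemble of "classical" states SD(i): the purity Tr(rho_sd^2) drops to 1/2 even though the total state remains pure, which sets up the trilemma in the next comment.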

So far so good, but we still see only one of the possible states. It seems to me that we should either:

1) Add an independent collapse to reduce the ensemble of possible outcomes to a single outcome.

2) Accept a MWI-like ontology where all possible outcomes are real in separate Everett branches.

3) Conclude that the observed outcome must be determined by some aspects of the environment.

Thoughts?

+ 0 like - 0 dislike

Classical behavior is not just decoherence, but an inclusive (average) picture. You need many, many experiments joined into one average picture, whose mean value you ascribe to a classical (deterministic) "outcome" ;-)

Classical behavior is the behavior of the mean value. Randomness never goes away in QM (nor in CM).

answered Sep 12, 2016 by Vladimir Kalitvianski (132 points) [ no revision ]

user contributions licensed under cc by-sa 3.0 with attribution required
