PhysicsOverflow is an open platform for community peer review and graduate-level Physics discussion.

  Is quantum randomness fundamental?

+ 3 like - 0 dislike

Quantum systems undergo two types of evolution in time: deterministic evolution governed by the equations of quantum physics, and quantum jumps upon measurement (aka reduction of the state vector or collapse of the wave function). I believe the consensus, or at least the majority view, is that quantum jumps are fundamentally random.

I am intrigued by a comment by @ArnoldNeumaier in another thread: "The notion of fundamental randomness is on logical grounds intrinsically meaningless. I.e., one cannot in principle give an executable operational definition of its meaning. I therefore believe that quantum randomness is not fundamental but a consequence of a not yet found highly chaotic deterministic description."

I wish to ask @ArnoldNeumaier and others to clarify and expand.

asked Nov 1, 2015 in Theoretical Physics by Giulio Prisco (190 points) [ revision history ]

3 Answers

+ 2 like - 1 dislike

On the logical, i.e., mathematical, level, probability is specified by a probability space consisting of a space $\Omega$ of all conceivable experiments relevant to the situation to which probability is applied, a sigma algebra of measurable subsets of $\Omega$, and a probability measure assigning a probability to each measurable subset. One can then define random variables as numbers $x(\omega)$ that depend on the experiment $\omega\in\Omega$ (formally, measurable functions from $\Omega$ to the complex numbers), their expectations (formally, integrals with respect to the measure), their variances, standard deviations, etc., and hence make the usual statistical predictions together with error estimates.
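As a minimal sketch of these definitions (the two-coin-flip example and all names here are my own illustration, not part of the answer), the finite case can be written out directly:

```python
# Finite probability space for two fair coin flips (illustrative example):
# Omega is the sample space of experiments, P the probability measure
# (given pointwise on a finite space), and x a random variable on Omega.
from itertools import product

Omega = list(product("HT", repeat=2))   # all conceivable experiments
P = {w: 0.25 for w in Omega}            # uniform probability measure

def x(w):
    """Random variable: the number of heads observed in experiment w."""
    return w.count("H")

# Expectation and variance as integrals (here: finite sums) against the measure.
E_x = sum(P[w] * x(w) for w in Omega)
Var_x = sum(P[w] * (x(w) - E_x) ** 2 for w in Omega)
print(E_x, Var_x)  # 1.0 0.5
```

Conditioning on additional knowledge (e.g., "the first flip was heads") restricts $\Omega$ and renormalizes $P$, which is the change-of-ensemble point made below.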

Thus from a purely logical point of view, probabilities are statements about sets of experiments called (in physics) ensembles. Talking about the probability of something always means to embed this something into an imagined ensemble of which this something is a more or less typical case.  Changing the ensemble (for example by assuming additional knowledge) changes the probabilities and hence the meaning. In mathematics, this is modeled by the concept of a conditional expectation - the condition refers to how the ensemble is selected.

Without stating the condition and hence specifying the ensemble, nothing at all can be predicted by a probabilistic model. Given the model (i.e., having fixed the ensemble) one can, however, predict expectations and standard deviations of random variables. By the law of large numbers, these predictions are valid empirically (and hence operationally verifiable) when the expectation is replaced by the average over sufficiently many independent, identically distributed realizations of the experiment.
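A quick simulation (my own sketch, assuming a fair-coin model with expectation 0.5) illustrates the law of large numbers invoked here: the empirical mean over many i.i.d. realizations lands close to the model's expectation.

```python
# Law of large numbers, empirically: the sample mean of n i.i.d. fair-coin
# realizations should be close to the expectation 0.5 for large n.
import random

random.seed(0)  # fixed seed so the sketch is reproducible
n = 100_000
sample_mean = sum(random.random() < 0.5 for _ in range(n)) / n
print(sample_mean)  # close to 0.5; the standard error is 0.5 / sqrt(n)
```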

The fact that this ensemble is necessarily imagined, i.e., must be selected from an infinity of possible conditions, already implies that it depends on the subject that uses the model. (There are so-called noninformative priors that attempt to get around this subjectivity, but choosing an ensemble through a noninformative prior is still a choice that has to be made, and hence subjective. Moreover, in the cases of interest to physics, noninformative priors usually do not even exist. For example, there is no sensible noninformative prior on the set of natural numbers or on the set of real numbers that would define a probability distribution.)

Objectivity (and hence a scientific description) arises only if the ensemble is agreed upon. This agreement (if it exists) is a social convention of the scientists in our present culture; to the extent that such an agreement exists, a model may be considered objective.

Even within the limited domain of objectivity afforded by the social convention of our present culture, verifying a probabilistic model requires the ability to perform sufficiently many independent realizations of the experiment. This is possible in the case of microscopic quantum mechanics, since the models there are about molecular or submolecular entities, and these are (according to our present understanding of the laws of Nature) identical throughout the universe. This makes it feasible to prepare many microscopic systems independently and identically distributed to sufficient precision for checking predictions.

However, when applied to macroscopic systems, this is no longer the case. Already for a gram of ordinary matter (e.g., pure water) we (or any conceivable subject confined to our universe) can prepare only the macroscopic (hydrodynamic) degrees of freedom. And it is impossible to make independent replications of the Earth, the Sun, or our Galaxy. These are unique objects, for which a probabilistic analysis is logically meaningless, since the size of the corresponding ensemble is 1, and no law of large numbers applies.

The way we apply statistical mechanics to the Sun is by predicting not the state of the Sun but the state of tiny few-body systems modeling nuclear reactions within the Sun. Similarly, the statistical mechanics of galaxies used in cosmology reduces the number of degrees of freedom of each galaxy to a few numbers, again reducing "our" Galaxy to an anonymous typical element of a valid ensemble.

Thus on the level of individuals, probability concepts lose their logical basis and their operationally verifiable character.

Now consider a model of the whole universe. Since it contains our unique Earth, and since we may define our universe as the smallest closed physical system containing the Earth, our universe is unique. It has the same character as the Earth, the Sun, or the Galaxy. By the same argument as above, one can apply statistical concepts to little pieces of the universe, but not to the universe itself. Therefore probability concepts applied to the universe have no logical basis and no operationally verifiable character. Using them in this context is logically meaningless.

This may be the reason why the Many Worlds view is popular. This view asserts (without any experimental evidence) that our universe is only one of many universes that are independent and identically distributed. However, nobody ever spelled out a sound probability distribution for the resulting ensemble. There are infinitely many choices, and all of them have exactly the same observable consequences - namely none at all. For whatever we observe experimentally is an observation about some random variables of our own universe, hence of the unique universe that we happen to be in. It is absolutely impossible to observe anything about any of the other assumed universes, since observation requires interaction and, by definition, a closed system doesn't interact with anything outside it.

Hence we cannot check any nontrivial assertion about the ensemble of universes. This makes any discussion of it purely subjective and unscientific. The Many Worlds view may be popular but it has nothing to do with science. It is pure science fiction.

Since, as we saw, probability concepts applied to the universe are logically meaningless, any logically sound theory of everything must necessarily be deterministic. That we haven't yet found one that is satisfying is the usual course of science; it implies that we are not yet at the end of the road of discoveries to be made.

answered Nov 1, 2015 by Arnold Neumaier (15,787 points) [ no revision ]

As I understand it, it is important to distinguish between the Everettian many-worlds quantum interpretation and other situations where more than one universe is considered.
For example, in the context of the string-theory landscape as the ensemble of all possible solutions of the governing equations of string theory, a statistical explanation of the characteristics of our universe could be better defined.

@ArnoldNeumaier re "The Many Worlds view may be popular but it has nothing to do with science. It is pure science fiction."

I will study your answer tomorrow, but wish to comment quickly on this point.

Everett's MWI (or the "many minds" variant that is, I believe, what Everett really had in mind) provides a conceptually economical interpretation of reality that not only does away with the need for quantum jumps, but is also useful to avoid many paradoxes that plague other interpretations, so it can't be "pure science fiction." Saying that it's science fiction is like saying that complex numbers are science fiction because you can't measure i with a stick.


1. Everett's MWI is perhaps popular also because it promises to be a conceptually economical interpretation of reality that not only does away with the need for quantum jumps, but is also useful to avoid many paradoxes.

Unfortunately, Everett's argumentation is flawed by a well-disguised circularity, and hence cannot serve as a logically valid foundation. His analysis simply derives the projection postulate by having assumed it, without any discussion, in disguise. See my answer at http://www.physicsoverflow.org/33940.

2. Many Minds is very different from Many Worlds, since it makes the interpretation subjective (depending on the minds) and leaves open how the minds evolve in a unique World in such a way as to produce this subjectivity.

3. The complex number $i$ is not a statement about physics, hence cannot be said to be science fiction (unless all of mathematics is). But the MWI makes claims about the real world, and surely the possibility of checking something in physics by experiment is one of the hallmarks of science.

@VladimirKalitvianski - I fail to see your point. Science is all about describing the universe. Any finite description is likely to be incomplete, but what we can describe permits doing useful things and building useful machines like the PC where I am typing this answer. Without the generations of scientists who dedicated their life to describe the universe, all I could send would be smoke signals (and not even that - mastering fire was also a scientific process).

@ArnoldNeumaier (re: MWI/string-landscape) Actually, that's exactly the point - the string landscape has nothing to do with Everett's many worlds; different theories in the string landscape are simply different theories, just like Type I is different from Type IIA in the 10 dimensional landscape.

I don't get what you meant when you said MWI was "science fiction", but MWI is, by definition, an interpretation of Quantum Mechanics. It isn't meant to make predictions of its own, and is intrinsically different from Bohmian mechanics and the other attempts to make deterministic formulations of QM, in that it's not a different theory; it's the same thing, makes the same predictions as QM.


@Void: I'll be more precise, so that you can see that there is nothing violating causality or causal locality. (See this post for my distinction between causal nonlocality and Bell nonlocality.)

We (the congregation of scientists) induct from what is in the collective memory of the past causal cone of the Earth to the laws of science, which are then postulated to be universally valid until limitations are noticed, through comparison with collective memories of the past causal cone of the Earth at a later time. This is a necessity if we assume that relativity is universal. Indeed, the latter is a universal law that was inducted in precisely this way, and all physical induction ever done was also done in this way.

Moreover, this is mathematically fully well-defined and sociologically fully analogous to how Galileo Galilei inducted from his experiments the first modern laws of physics. 

It is well-known that Bohmian mechanics provides a deterministic substructure from which quantum mechanics can be derived by plausible reasoning. However, besides being awkward and counter-intuitive, in my view it cannot be said to be a fundamental description, since its ontological pictures for (noninteracting) quantum fields and particles are incompatible, and since it has no coherent notion of interacting quantum fields.

Moreover, my argument that a fundamental description of the whole universe is necessarily deterministic is completely independent of any discussion of nonlocality. 


Of course, due to relativity, the observer cannot have complete knowledge of the present, whence the practical predictions must necessarily be done in the presence of uncertainty, which explains the probabilistic outcome.

I interpret "present" in your text as "the state of the whole universe at some time $t=t_0$". But this is obviously Lorentz-violating.

To predict some physical occurrence at an event $p$ at some time-like separation from you, you should need only information from a finite space-like patch, the intersection of the causal past of $p$ with some arbitrary space-like hypersurface around you, not from the whole universe.

If you need information extending beyond the causal past of $p$, then you have a problem either with Lorentz violation or with the order of causation. I.e., if particle $A$ formally caused particle $B$ to turn out with an opposite spin by causation at a space-like separation, then there will necessarily exist a frame in which it will seem that it was actually $B$ that caused $A$ to turn out with the opposite spin. The only way out of this is to violate the equivalence principle and posit the existence of a privileged class of frames giving the "global time". But for this to be physically real, such a violation must be physically measurable, thus leading to Lorentz violation.

If you take a look at your answer, you will find that the "cosmological law" argumentation is sound, but if and only if it applies to the whole 4D universe-block, i.e., the whole temporal evolution of the universe as one object. This is because "frozen universe moments" are parts of the universe very much like the Earth or the Sun; there is simply more than one specimen to compare.

But when you think about what the "block-universe law" is and what it should say, you find out that the same features that lead to no-probability also lead to no-science. (As I have already stated in the first comment.)

+ 0 like - 0 dislike

The thing is that probability is very useful as a model, but it's hard to pin down what it really means.

What does it mean to say that the probability of the event $A$ is $p$?

Supposedly, it means that if you were to repeat the corresponding probability experiment over and over, then $A$ would in the limit occur a fraction $p$ of the time. (Well, with probability 1 this would happen...)

But what does this subjunctive statement mean physically? After all, you can't repeat anything infinitely many times. Well, if you repeat it many times then probably $A$ occurs about a fraction $p$ of the time. But that's circular -- what does "probably" mean?

answered Apr 27, 2017 by Bjørn Kjos-Hanssen (10 points) [ no revision ]

There are cases where "many times" is meaningful; for example, there are rather many photons in the interference picture.

The trick with probabilities has another "dimension": it concerns the roughness (or approximateness) of our "identical" events. We consider different things (different apples, for example) as identical, and then we may apply mathematics to count them. As soon as we start to distinguish our apples, we can no longer count them in this way.

Re "Well, if you repeat it many times then probably A occurs about a fraction p of the time."

You don't need "probably" here, "about" is enough.

@GiulioPrisco if we repeat 1,000 times it doesn't follow that $A$ occurs anywhere close to a fraction $p$ of the time. $A$ could occur 0 times in 1,000...
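To make that comment quantitative (a sketch of mine, assuming independent trials with p = 0.5): the probability of 0 occurrences in 1,000 trials is $(1-p)^{1000}$, which is astronomically small but strictly positive, so no finite run guarantees a frequency near $p$.

```python
# Probability that A occurs 0 times in 1,000 independent trials with p = 0.5.
p = 0.5
prob_zero = (1 - p) ** 1000
print(prob_zero > 0)       # True: possible in principle
print(prob_zero < 1e-300)  # True: about 9.3e-302, fantastically unlikely
```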

@GiulioPrisco I guess what you're proposing is something like Cournot's Principle.

+ 0 like - 1 dislike

Yes, it is fundamental. If you decrease the intensity of light falling on a screen, the image builds up as random dots. The deterministic picture is an average picture. For making an average, it does not matter in what order the dots are collected and averaged, by definition of an average value.

P.S. As I said, the deterministic picture is an average picture, thus not fundamental. And quantum randomness is due to the quantum system (environment) that participates in creating each event. The wave function is determined by all points of "space"; thus all of them participate in creating the interference picture.

answered Nov 1, 2015 by Vladimir Kalitvianski (102 points) [ revision history ]
edited May 2, 2017 by Vladimir Kalitvianski

I don't think this is what the question was asking for. The question is regarding whether hidden variable theories can be right.

@dimension10 - yes, the question is related to whether hidden variable theories can be right. However (just thinking aloud) perhaps the two formulations are not equivalent. Consider this hidden variable theory:

The state of a quantum system has two parts. One evolves according to the equations of quantum physics. The other represents a little demon who senses when a "measurement" is taking place and flips a little coin to determine the outcome. Then the state of the quantum system jumps according to the demon's random choice.

This qualifies as a hidden variable theory but randomness is still fundamental.

@GiulioPrisco Right, but it would still be possible to write a wavefunction, except it would no longer obey the standard state equations of QM. I think when most people talk about hidden variable theories, they necessarily talk about deterministic theory. At least, that was my usage in my previous comment.

One can speak of "some" hidden variables without knowing all the variables of the presumed deterministic theory. Having only a part of them may not exclude an apparent randomness. The EPR claim is not that nature is not quantum; it is just about the Bohr interpretation. If one finds the same results as QM while using shared variables, even with a random function, the EPR claim would not be invalidated by the Bell theorem. The latter fundamentally assumes that a non-Bohrian theory cannot reproduce the correct results. Additionally, it also assumes that the detection is perfect, mainly for Einstein's defenders. I have never seen an experiment with 100% detection. Extraordinary claims require perfect proofs.

user contributions licensed under cc by-sa 3.0 with attribution required
