The Everett Interpretation

When one introduces hidden variables or state reduction, certain kinds of physical quantities (the “preferred” ones) get to be value-definite - among them the observed quantities (quantities like position, which are well-localised in space). Eschewing hidden variables or state reduction, we still have to pick out preferred quantities. How? And precisely which ones? This is the preferred basis problem. The tightrope that must be walked (if we are to make sense of quantum mechanics without hidden variables or state reduction) is to show, first, how certain sorts of quantities get to be preferred (the preferred basis problem), and second, how particular values get to be assigned to such quantities (the problem that going over to many worlds - or, as Albert and Loewer have suggested, in an approach which has received a lot of subsequent attention, going over to a many minds approach - is supposed to solve).

The first is the problem that has been attacked by the physicists. They have made systematic progress with it: it is exactly the business of decoherence theory to extract “effective” equations of motion, concerning those dynamical variables for which value-assignments can be made in a way which is stable over time, and without “interference effects” linking their different values. And indeed it turns out that the variables arrived at in this way are well-localised in space. But decoherence theory does not solve the preferred-basis problem on its own. One question that remains is why, even given that such-and-such a basis decoheres, that should be the basis that we see. My suggestion is that complex functional systems, capable of encoding records and processing information, can only be constituted out of stable sequences of states - and stability, and indeed the possibility of reliably encoding records, is only possible with decohering sequences of states. And of course anything like an observing system - computer, amoeba, or human - is going to have to be a complex functional system.
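
By way of a toy illustration of the stability decoherence buys - my own sketch, with the coupling model and numbers invented purely for convenience, and not drawn from the papers listed below - the following few lines show how interference terms between the values of a system variable are suppressed as more and more environmental degrees of freedom become correlated with it, while the candidate value-assignments themselves are untouched:

      import numpy as np

      # A minimal sketch: one "system" qubit in an equal superposition, coupled to
      # n environment qubits.  Each environment qubit starts in |0> and is rotated
      # by a small angle theta only if the system is in |1>, so the two branches of
      # the superposition become correlated with increasingly distinguishable
      # environment states.  Tracing out the environment leaves the system's
      # reduced density matrix: its off-diagonal ("interference") term is multiplied
      # by the overlap <E0|E1> of the two environment branch states, which shrinks
      # as cos(theta)**n, while the diagonal terms - the candidates for stable
      # value-assignment - are untouched.

      def reduced_density_matrix(n_env: int, theta: float = 0.3) -> np.ndarray:
          a = b = 1 / np.sqrt(2)                          # system amplitudes
          e0 = np.array([1.0, 0.0])                       # env qubit state if system is |0>
          e1 = np.array([np.cos(theta), np.sin(theta)])   # env qubit state if system is |1>
          overlap = np.dot(e0, e1) ** n_env               # <E0|E1> for n_env product states
          # Partial trace over the environment of |Psi><Psi|,
          # where |Psi> = a |0>|E0> + b |1>|E1> and all amplitudes here are real:
          return np.array([[a * a,           a * b * overlap],
                           [a * b * overlap, b * b          ]])

      for n in (0, 5, 20, 80):
          rho = reduced_density_matrix(n)
          print(f"n_env = {n:3d}   |rho_01| = {abs(rho[0, 1]):.6f}   rho_00 = {rho[0, 0]:.3f}")

The off-diagonal element falls off as cos θ raised to the number of environment qubits; it is this sort of environment-induced stability that record-keeping systems have to exploit.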

For an explanation of why complex adaptive systems should be composed of decohering variables (in the sense of the consistent histories approach) see my:

  • "Decoherence and Evolutionary Adaptation", Physics Letters A 184 (1994), p.1-5.
  • "Decoherence, Relative States, and Evolutionary Adaptation", Foundations of Physics, 23 (1993), 1553-1585.
  • For more formal properties of decohering variables, and specifically a proof that decoherence (in the sense of consistency) implies that value-definiteness as a relational construct is transitive (in general it isn 't), see my:

  • "Relativism", in Perspectives on Quantum Reality, R. Clifton, ed., Kluwer, Dordrecht (1995), p.125-42.
    Physicists today are largely agreed that an approximate, FAPP ("for all practical purposes") basis can be defined for its intrinsic utility in quantum mechanics: among the various patterns one can make out in the universal state, the ones governed by effective equations, as defined in the decoherence basis, are the most important. But it may still be thought there is a problem of intelligibility: what does it mean to say worlds divide, or are subject to branching? What does it mean to say there are other worlds at all, when ours alone appears real? In what sense is our world even apparently privileged? But here there are analogous and well-known - indeed ancient - problems in the philosophy of time. What does it mean to speak of time's passage? What does it mean to say "the now" is what is real? These and other parallels are developed at length in my:

  • "Time, Quantum Mechanics, and Decoherence", Synthese, 102, 235-66 (1995), which you can download here as a pdf file.
    The other great bugbear of the Everett interpretation is the interpretation of probability. The fundamental difficulty, if one really tries to make do with pure quantum mechanics, is that there is no univocal criterion of identity over time, not of persons and not of objects; or none that is locally coded into the formalism. Most philosophers of physics writing on this subject are sure this is needed (physicists tend not to think about it): for how else are we to make sense of talk of what we will see? (We, the very same?) Or of what the apparatus, the very same, will record? But I think the notion of personal identity already collapses in classical physics. Identity over time does not have to have the formal properties of identity.

    For a systematic account of probability in the Everett interpretation see my:

  • "Time, Quantum Mechanics, and Probability", Synthese, 114 (1998), 405-44 (also available at http://xxxx.arXiv.org/abs/quant-ph/0111047). You can download it here as a pdf file.
    The essential ideas of this paper are that the right attitude to have to the imminent prospect of branching is uncertainty; that physical characteristics of branches should dictate one's expectations; and that an account of objective chance in terms of relative norms of branches is no worse off than any of its competitors. Relative frequencies of outcomes, on repeated trials, remain as ever our guide to quantifying objective chance; but for well-known reasons (independent of the Everett interpretation) they cannot be identified with it.
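
    By way of a toy rendering of that last point - my own illustration, with an arbitrarily chosen weight, not anything taken from the paper above - one can bookkeep the branches generated by repeated trials according to their weights; the total weight of branches whose relative frequency departs appreciably from the weight itself dwindles as the number of trials grows, which is why frequencies can guide the quantification of chance without being identical with it:

      from math import comb

      # Toy illustration (with an arbitrary weight p): a two-outcome "measurement"
      # repeated N times.  Each outcome sequence labels one branch; a branch with
      # k "up" results carries weight p**k * (1-p)**(N-k), and there are comb(N, k)
      # such branches.  The total weight of branches whose relative frequency k/N
      # differs from p by more than eps shrinks as N grows.

      def weight_of_deviant_branches(p: float, N: int, eps: float) -> float:
          return sum(comb(N, k) * p**k * (1 - p) ** (N - k)
                     for k in range(N + 1)
                     if abs(k / N - p) > eps)

      p, eps = 0.36, 0.05          # hypothetical branch weight and tolerance
      for N in (10, 100, 1000):
          w = weight_of_deviant_branches(p, N, eps)
          print(f"N = {N:4d}: weight of branches with |k/N - p| > {eps}: {w:.4f}")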

    I still believe that argument is substantially correct - but I did not expect to find positive arguments as to why the relative norms of branches are what matters. Here Deutsch's argument of 1999 from decision theory came as a welcome surprise; Wallace's more detailed treatment is a revelation. It is something of a footnote (if the Everett approach is correct) that the key ideas are enough to force the Born rule on purely operational assumptions. (The argument has the merit of applying to every school of foundations that shares these operational assumptions.) For the details, see my

  • "Derivation of the Born Rule from Operational Assumptions", Proceedings of the Royal Society of London , A, 460, 1-18 (2004) (available at http://xxxx.arXiv.org/abs/quant-ph/0211138, but with a trivial but mildly embarrassing error in one of the proofs). You can download it here as a pdf file.

    What becomes clear is that probability in the Everett interpretation is not a fundamental concept - equivalently, that it only has meaning in the context of decoherence. Probabilistic events at the sub-decoherence level are entirely a matter of correlations with decohering states (and only have meaning insofar as they are so correlated). That in turn raises a question mark over derivations of the Born rule (including mine, the Deutsch-Wallace theory, and Gleason's) which assume probability should make sense (and be well-defined) for arbitrary choices of basis. But in fact it need not, as my derivation of the Born rule for a basis fixed once and for all shows. It is contained in my forthcoming article

  • "What is Probability?", in Quo Vadis Quantum Mechanics, Elitzur, S. Dolev, and N. Kolenda, eds., Springer (2005), which you can by download here as a pdf file.

    In this article the lattice-theoretic, formal properties required of the basis can be motivated directly by the approximate nature of decoherence. There too I show how one can derive Lewis's "principal principle", along lines first sketched out by Wallace. Wallace's work has informed most of my recent thinking about many-worlds: you can find his papers on the Los Alamos archives, in print in Studies in the History and Philosophy of Modern Physics, and on his website.
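
    For reference, Lewis's principle says that a reasonable agent's initial credence in a proposition A, conditional on the objective chance of A being x (together with any admissible evidence E), should itself be x:

      \[
        \mathrm{Cr}\big(A \;\big|\; \langle \mathrm{ch}_t(A) = x \rangle \wedge E \big) \;=\; x .
      \]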

    For difficulties with "one-world" interpretations of the histories formalism see also my:

  • "Space-Time and Probability", in Chance in Physics: Foundations and Perspectives, J. Bricmont, D. Dürr, M.C. Galavotti, G. Ghirardi, F. Petruccione, N. Zanghi (eds.), Springer-Verlag, (2001),which you can download here as a pdf file.
    The interpretation of probability in the Everett interpretation has long been thought its weakest link. On the contrary, it is one of the strongest points in its favour.

    Copyright Simon Saunders 2001. Last updated: 8 October 2004.