Essays
by H.D. Zeh and E. Joos
The Essence of the Concept of Decoherence
Decoherence is now often identified
with certain
phenomena that have either been experimentally observed or are relevant
in
practice for other reasons. This may be in accord with an operational
approach to quantum physics, but it tells us nothing about the basis of
this concept in the established formalism from which it had originally been derived.
The second kind of phenomena is best known from attempts
to realize
quantum computers, where decoherence is regarded as an unwanted
"distortion" caused by the environment. This picture has led
to various unsuccessful attempts to construct "error correction codes"
by means of redundant information storage as in classical computers.
They can hardly ever be successful, since (1) quantum states cannot be
cloned,
while (2) genuine decoherence is an
irreversible process. Only "virtual decoherence", defined by means of
an unrealistic microscopic
environment, could be
reversed in practice. A similar reversibility is used in delayed
choice
measurements or
so-called quantum erasers, where a virtual
measurement is
"undone" (cf. here).
The key experiments which have confirmed the phenomenon of
decoherence, on the other hand, demonstrate the
disappearance of an
otherwise observed interference pattern, usually for mesoscopic objects
whose effective
environment may be varied to exhibit
decoherence or not. Although this statistical phenomenon is indeed a consequence of
the corresponding decoherence process, it can be observed only for
ensembles of
measurement
results (such as many spots on a screen), while the irreversible
process of decoherence affects each individual measurement. In
these experiments, decoherence
affects the investigated mesoscopic systems twice:
first while they pass the slits of an interferometer or while they live
for a
short
time as "Schrödinger cats" in an isolated cavity, and for the
second time during the final measurement that leads to the
"spontaneous" appearance of individual
spots, bubbles, or clicks. Only the first decoherence process is
investigated in these
experiments, while, for the measurement proper, experimentalists
analyzing their results often forget what they just demonstrated, and
thus return to a
pragmatic statistical interpretation without referring to the
corresponding decoherence process. Some of them even regard their own
inconsistency as support of a wave-particle dualism or of
complementarity.
The reduced density matrix, derived from a complete description of
the state of a global system by tracing out the environment, is a
useful tool to describe the decoherence of a system under
consideration. It can be used to investigate in detail how certain
phase relations disappear from the system, thus transforming a pure
state for it into a "mixed state", for example. Given a realistic
environment,
this tells us (very successfully) which variables must appear
classically (lacking any superpositions), or in which situation we have
to expect almost sudden "quantum jumps" or other stochastic
"events" to occur. So it seems
that we neither need fundamental classical variables any more, nor any
indeterministic dynamics. However, this success, which led to
the early popularity of the decoherence concept, is partly based on an
ambiguity of the concept of the density matrix for a "mixed state".
(That for a pure state is unique, since it is equivalent to the state
vector or wave function.)
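A minimal sketch in standard notation (the basis states |n⟩ and environmental states |Φ_n⟩ are generic placeholders, not taken from the text above): if the system and its environment have evolved into an entangled state
\[ |\Psi\rangle = \sum_n c_n\, |n\rangle\, |\Phi_n\rangle , \]
then tracing out the environment gives the reduced density matrix
\[ \rho_{\mathrm{sys}} = \mathrm{tr}_{\mathrm{env}}\, |\Psi\rangle\langle\Psi| = \sum_{n,m} c_n c_m^{*}\, \langle \Phi_m | \Phi_n \rangle\, |n\rangle\langle m| . \]
When the environmental states become (almost) orthogonal, ⟨Φ_m|Φ_n⟩ ≈ δ_mn, the off-diagonal terms – the phase relations – disappear from the system, and ρ_sys approaches the apparent mixture Σ_n |c_n|² |n⟩⟨n|.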
The reason for this ambiguity is that the density matrix is defined
only to describe the
correct probabilities for all measurements that can be performed on
this quantum system according to Born's statistical interpretation.
This means, first of all, that it cannot itself be used to derive this statistical interpretation in terms of stochastic quantum jumps. (The concept of "systems" is again an arbitrary tool - similar to the choice of coordinates. A system need not even be local, for example, although such an assumption usually appears natural.) A density matrix may
represent a pure state (if it is a projector onto a state vector), or a
mixed
state. The latter may then either represent a statistical ensemble of
pure states of the system with given probabilities, or form the reduced
density matrix with respect to the state of some global system that
includes all other systems with which the subsystem is entangled. In
the first case one could simply "select" a pure state from the
ensemble by an increase of information (just as for a classical
probability distribution),
in the second case one would first have to apply a stochastic
interpretation
to the global state in order to obtain an ensemble to select from. (A
mixed state for
the whole universe can always be interpreted as representing incomplete information about a pure state.) Although the difference between entanglement and incomplete information should
by now be well known to all quantum physicists, this confusion is still
responsible for many misunderstandings of decoherence. The
concept of decoherence did in fact arise from the insight that
entanglement
describes a fundamental nonlocality rather than mere statistical
correlations.
The complete
situation can be described conceptually only by means of the wave
function for the required global system. If either a measurement or an
uncontrollable interaction with the environment happens to some system,
the latter becomes entangled with whatever it interacted with. A pure
system
state (possibly one of the states diagonalizing its mixed density
matrix)
would thereby be transformed into a pure but entangled global state. An
initial superposition is thus "dislocalized": it is neither in the
system nor in the environment thereafter - something that can happen
only in a nonlocal world. It has always been known that the quantum
formalism is nonlocal - actually long before John Bell published his
arguments which demonstrated once and for all that this nonlocality
cannot be a statistical artifact due to incomplete knowledge about some
as yet hidden local variables. (Note that in the literature one finds
several popular but insufficient "measures of entanglement" which
measure only that entanglement which can somehow be used, while they
neglect
precisely all the uncontrollable entanglement that leads to
decoherence.)
As I pointed out above, the reduced density matrix contains complete
information about everything that can be observed at a local system.
So, decoherence describes an irreversible transition of the "system"
state into an apparent ensemble
for all practical
purposes. This irreversibility is induced by the time arrow
characterizing the environment. If a measurement apparatus could
be treated as a (controllable) microscopic system, the measurement
would be reversible (it could be "undone"). However, a macroscopic
pointer must unavoidably interact with its uncontrollable environment
in each individual measurement. Therefore, it appears quite unmotivated
to invent any fundamental irreversible process, such as a collapse of
the
wave function, or to assume fundamental classical concepts to apply,
precisely where and when the observable or irreversible phenomena
occur. In particular, classical concepts (often defining the pointer
basis of a measurement device) emerge according to the
objective irreversible process of decoherence, while there remain
various possibilities to explain why we observe individual measurement
outcomes. If no new physics is found to apply somewhere
between apparatus and observer, we may have to accept the "many worlds"
interpretation.
The essence of decoherence is thus given by the permanent increase of
entanglement between all systems. It describes a situation very far
from
equilibrium, and it leads to the permanent dislocalization of
superpositions. Its time arrow is formally analogous to the creation of
"irrelevant" statistical correlations by Boltzmann
collisions. Neglecting these classical correlations, for example by
using a
µ-space
distribution, leads to an increase of ensemble entropy. This
consequence remains
true as
well in quantum theory (in the sense of an "apparent ensemble entropy")
if one neglects
entanglement by relying on reduced density
matrices of subsystems. However, one should keep in mind that
entanglement represents individual
properties of the combined systems (such as
total angular momentum) - hence not just incomplete information.
Certain entangled states, such as Bell states, are even considered as
potential individual measurement outcomes in some experiments. In spite
of the analogy with
statistical correlations, the neglect of entanglement describes a
change of the physical states. The arrow of time defined by
the decoherence process requires a special initial condition for the
universal wave
function (namely: little or no initial entanglement).
Evidently, this must be a physical condition - it cannot just be a
condition for initial "human knowledge" or some kind of "information".
How decoherence can solve the measurement problem
Decoherence may be defined as the uncontrollable dislocalization of
quantum
mechanical
superpositions. It is an unavoidable
consequence of
the interaction
of all local systems with their environments
according to the
Schrödinger equation. Since the dislocalization
propagates in
general without
bounds, this concept of decoherence
does not depend on any precise
boundaries between subsystems. All
systems should
be
entangled
with their growing environments, and generically cannot possess quantum states of their own. They may then formally be described by a
reduced
density matrix ρ
representing
a "mixed state", with a von Neumann entropy -trace(ρ
lnρ) that
in
general varies in time. This reduced density matrix
is
operationally
indistinguishable from that describing an ensemble
of states – as
though a subsystem state did exist but were only
incompletely
known.
For this reason, it is often erroneously
identified with an ensemble.
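As a simple illustration (the example is generic, not taken from the text): for two spins in the maximally entangled state
\[ |\Psi\rangle = \tfrac{1}{\sqrt{2}} \bigl( |{\uparrow}\rangle|{\downarrow}\rangle - |{\downarrow}\rangle|{\uparrow}\rangle \bigr) , \]
the reduced density matrix of either spin is ρ = (1/2)·1, with von Neumann entropy -trace(ρ lnρ) = ln 2, even though the global state remains pure (with zero entropy). Exactly the same ρ would describe an ensemble in which the spin is "up" or "down" with probability 1/2 each – which is why the two situations cannot be distinguished by any local measurement.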
Since the
dynamical situation of increasing entanglement
applies in particular
to systems representing macroscopic outcomes of
quantum measurements
("pointer positions"), decoherence has
occasionally
been
claimed
to
explain
the
probabilistic nature of quantum mechanics (quantum
indeterminism).
However,
such a conclusion would evidently contradict
the determinism of
the thereby presumed unitary global dynamics.
(Note that
the claim – if correct – requires decoherence
to be
irreversible, as the
measurement could otherwise be
undone
or
"erased" – see Quantum
teleportation
and
other quantum
misnomers). Although the claim is
operationally
unassailable, it is
wrong. The very
concept of a
density
matrix is already based on local
operations (measurements) which
presume the probability
interpretation, while the
global quantum state
always remains pure and uniquely
determined under the exact
unitary dynamics.
Because
of this popular "naive" misinterpretation of decoherence, I
have
often emphasized that the latter does "not by itself
solve
the
measurement problem". This remark has in turn been quoted
to argue
that
decoherence is quite irrelevant for a solution of the
measurement
problem. The argument has mostly been used by
physicists
who insist on a
"conventional" solution: either by means of
a
stochastic novel dynamical law, or on the basis of an ensemble of
as
yet unknown
(hidden)
variables. Their hope can indeed not
be fulfilled by decoherence,
and it may forever remain wishful
thinking. In particular,
"epistemic" interpretations of the wave
function (as merely
representing incomplete knowledge) usually
remain silent about the
nature of what this knowledge is about in order
to avoid
contradictions.
A stochastic collapse of the wave function
as a real physical process,
on the other hand, would require
a fundamental
non-linear
modification of the Schrödinger
equation. (It would not make any
difference if this stochastic dynamics
were derived from the presumed
deterministic dynamics of some
hypothetical, but
in principle
unobservable variables.) Since, in Tegmark's words,
decoherence
"looks and
smells
like a
collapse", it is instructive first to ask in
what
sense such
collapse theories would solve the measurement problem
if
their prospective
non-linear
dynamics were ever
confirmed
empirically (for example, by studying systems that
are completely
shielded
against decoherence – a very difficult
task).
According to von Neumann's analysis of the
measurement process, a
collapse could indeed
solve the measurement
problem,
although many
physicists seem to prefer the questionable
formulation that
the Schrödinger equation is exact but applicable only between
the
"preparation" and "measurement" of a quantum state. The
wave
function would then only represent a tool to calculate
probabilities
for other (classical?) variables, whose values "enter
existence" only
in
measurements. However, it appears absurd to assume that the wave function exists only to allow experimental physicists to make predictions for their experiments. It would then
also
remain completely open how macroscopic objects,
including
preparation and
measurement devices themselves, could ever
be consistently described as
real
physical
systems consisting of
atoms.
It is well known that superpositions of two or more quantum
states
represent (new) individual
physical properties as long as the system
remains isolated, while they
seem to turn
into
statistical ensembles
when measured and hence subjected
to
decoherence. (To my knowledge, no "real", that is, irreversible, measurement has ever been performed in the absence of decoherence.)
So what
would it mean if
appropriate
non-linear collapse terms in the dynamics were confirmed
to
exist?
These theories require that an assumed or prepared wave
function for
the different positions of a macroscopic pointer
(or any
other macroscopic variable) indeterministically evolves or
jumps
into
one of many possible
narrow wave
packets that may represent a real
pointer
position. These wave packets resemble
Schrödinger's coherent
states, which he once used to
describe
quasi-classical
oscillators,
and which he hoped to be representative of all
quasi-classical
objects (apparent particles, in particular). His hope
failed because
of the dynamical dispersion of the wave packet under
the
Schrödinger equation, while coherent states successfully
describe
time-dependent quasi-classical states of electromagnetic
field modes,
which interact very weakly with their environment. The
ensemble of all
possible
outcomes of the postulated collapse into such wave
packets of pointer positions, weighted by the empirical
Born probabilities,
would be described by essentially the same density
matrix as
that
arising from
decoherence. This collapse assumption would
mean that
no fundamental
classical
concepts are needed any more for an interpretation of
quantum
mechanics.
Since macroscopic pointer states are assumed to collapse
into wave
packets in their position
representation, there is no
eigenvalue-eigenfunction link problem that
might arise in epistemic
interpretations. General "observables" then
appear as a derivable concept.
As an application, consider the particle track
arising
in a Wilson or bubble chamber, described by a succession
of
collapse events. All the
little droplets (or bubbles in a bubble
chamber) can be interpreted as
macroscopic
"pointers" (or documents).
They can themselves be observed without
being changed
by means of
"ideal measurements". In unitary description, the state
of
the
apparently observed
"particle" (its wave function) becomes
entangled with all these pointer
states in a way
that describes a
superposition of many
different tracks, each one consisting of a
number of droplets at
correlated positions. This entanglement
would
disappear according to the collapse, as it essentially
removes all
but one of the tracks (which are described by components of the
global wave
function that approximately factorize with respect to the
particle,
sets of droplets, and their environment). The lowering of
(local)
entropy as a consequence of the collapse is often
underestimated. So
one assumes
that
the kinematical concept of a
wave function is complete,
and hence, for example, that there are no
particles
in reality. In
contrast,
many interpretations of
quantum theory, such as the Copenhagen
interpretation or those based
on Feynman
paths or Bohm
trajectories, all entertain the prejudice that
classical concepts are fundamental
at some
level.
Decoherence
leads to the same local
density
matrix (for the combined system of droplets and "particle"), which therefore seems to represent an ensemble of tracks. The correlations between the wave functions
of
different
droplets forming tracks were
already known to Mott in
the
early days of quantum mechanics, but he did not yet take
into
account the subsequent and unavoidable process of decoherence of
the
droplet positions by their
environment. Mott did not see the need
to solve any measurement
problem, as he had accepted the probability
interpretation in terms of
classical variables. In a global unitary
quantum description, however,
there
is still just one
global
superposition of all
"potential" tracks consisting of droplets,
entangled with the
particle wave function and the environment: a
universal
Schrödinger cat. Since one does not
obtain an ensemble of potential states without a
collapse,
one cannot select
one of
its members by a mere increase of information. As such a
selection seems to occur,
it is this apparent increase
of
information that requires further
analysis.
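Schematically (the labels are chosen only for illustration), this universal Schrödinger cat has the form
\[ |\Psi\rangle \approx \sum_k c_k\, |\text{particle along track } k\rangle\, |\text{droplets of track } k\rangle\, |\text{environment}_k\rangle : \]
a single superposition of approximately factorizing components, one for each potential track – not an ensemble from which a member could be selected by a mere increase of information.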
Therefore, now add an observer of the Wilson
chamber to this picture.
According to the
Schrödinger equation,
he, too, would necessarily become part
of the entanglement with the
"particle", the device, and the
environment. Clearly, the phase
relations originating from the initial
superposition have now been
irreversibly dislocalized (become an
uncontrollable
property of
the
state of the whole universe). They can never be experienced any
more
by an observer
who is assumed to be local as a consequence of the
locality of
dynamics, but this dynamical locality also means that
certain
components
of the universal wave function become dynamically
autonomous by means
of
decoherence
(see Quantum
nonlocality vs. Einstein locality). The branches of the global wave function arising in this way form entirely different "worlds", which may
contain
different states of
various
observers.
If
we intend to associate
consciousness
with states of local observers, we can do this only separately for each of their thus dynamically defined component states. The observed
quantum
indeterminism must then
be
attributed
to the indeterministic
history of these quasi-classical world branches
with their internal
observers. No
indeterminism is required for the global
quantum state.
This identification of observers with states existing
only in certain
branching components of the global wave function
is
the
only
novel element that
has to be added to the quantum formalism
for a solution of the
measurement problem.
Different observers of
the same measurement result
living in the same "world"
are
consistently
correlated with one
another in a
similar way
as
the
positions of different droplets forming an individual track in
the
Wilson chamber. However, redefining the very concept of
reality
operationally as applying only to the subjectively observed
branch
would eliminate what we already knew for merely pragmatic
reasons
(Occam's razor applied to the facts rather than to the laws)!
The
picture of branching "worlds" perfectly
describes quantum
measurements – although in an unconventional way.
Decoherence
may thus be
regarded as a "collapse without a collapse".
(Note,
however, that decoherence occurring in quantum processes in the brain must be expected to lead to further indeterministic branching even after the information about a measurement result has arrived at the sensory system in a
quasi-classical form.) Why should we object to
the
consequence that there must
be
myriads of (by us) unobserved quasi-classical worlds
according to the
Schrödinger equation, or why should we
insist
on the existence of fundamental classical objects that we seem
to observe, but that we don't
need for a consistent physical
description of
our
observations?
Collapse theories (formulated by means of
fundamental stochastic
quantum Langevin
equations)
would not only
have to postulate
the
indeterministic transition of
quantum states into certain component
states, but
also their relative
probabilities according to the Born
rules as part of this modified
dynamics.
While even without
a
collapse, the relevant components (or robust "branches" of the
wave
function) can be
dynamically justified by the
dislocalization of
superpositions (decoherence), as described above,
the
probabilities
themselves
cannot. All attempts to derive them must be doomed to remain
circular
in some way. For example, Wojciech
Zurek's recent
attempts to
derive Born's rules by "going beyond decoherence" are
based on local
operations that presuppose
the
existence of subsystem states, which
he further assumes to "possess"
certain probabilities. Together they
would then define a formal state
of
(objective?) "information". In
this way, Zurek even claims to avoid
those many
Everett "worlds"
without postulating a collapse in what he calls his
"existential
interpretation" – evidently in contradiction to the
assumed
unitary dynamics. This approach seems to confirm Max
Tegmark's
alternative of "Many Worlds or Many Words"!
According to Graham, one may derive the
observed
relative frequencies of measurement outcomes
(their
statistical distribution) by merely
assuming that
our final
(the present)
branch of the universal wave function (in which "we"
happen to live)
does
not
have an
extremely small norm. Although the
choice of the norm is here
completely equivalent
to assuming the Born
probabilities for all individual branchings, it is
a
natural choice
for such a postulate, since the norm is
conserved under the
Schrödinger equation (just as phase space
is
conserved
in
classical theories, where it similarly serves as an
appropriate
probability measure). Nonetheless, most physicists
seem
to insist on a metaphysical (pre-Humean) concept of dynamical
probabilities, which
would explain
the observed
frequencies of measurement results in a "causal" manner.
However,
this assumption seems to represent a prejudice
resulting from our
causal classical experience.
There is now a wealth of
observed mesoscopic realizations of
"Schrödinger
cats", produced
according to a general Schrödinger equation. They
include
superpositions of different states of electromagnetic fields,
interference
between partial waves representing
biomolecules passing through
different slits of an appropriate device,
or
superpositions of
currents consisting of millions of electrons moving
collectively in
opposite directions. They can all be used to
demonstrate their
gradual decoherence by interaction with the
environment (in contrast
to previously assumed spontaneous quantum
jumps), while there is
so
far no
indication whatsoever for a
genuine collapse. However, complex
biological systems (living
beings)
can hardly ever be sufficiently isolated, since they have to
permanently
get rid of entropy. Such systems depend essentially on
the arrow of
time that is manifest in the growing
correlations
(most
importantly in the form of
quantum entanglement, and hence
decoherence).
Only in a
Gedanken Experiment
may we
conceive of an isolated observer, who for some interval
of
time
interacts with an also isolated measurement device, or even
directly
with a microscopic system (by absorbing a single photon, for
example).
One may also imagine an observer who is himself passing
through an
interference device while being aware of the slit he
passes through.
What
would that mean according to a universal
Schrödinger
equation? Since the observer's internal state of
knowledge must be
entangled with the variables that he has observed,
or with his path of
which he is aware, the corresponding
"global"
superposition defines several
distinct and dynamically
independent states
for him as different factor states in all
these components. So he
would subjectively believe that he passes through one slit only.
Could we confirm such a
prediction in principle? If we observed the
otherwise
isolated
observer from
outside, he should behave just as any
microscopic system – thus
allowing for recoherence.
Unfortunately, he would
thereby have to lose all his memory about
what he experienced. So can
we not ask him before recoherence occurs?
This would require him to
emit
information in some physical form,
thereby preventing recoherence and
interference. An observer in a
state that allows interference
could
never tell us which passage he
was aware of! This demonstrates that the
Everett
branching is
ultimately subjective,
although
we may always assume it to
happen objectively as
soon as decoherence has become irreversible for
all
practical
purposes. As this usually occurs in the apparatus of
measurement,
this
description justifies the pragmatic Copenhagen
interpretation
– albeit in a conceptually consistent manner and without
presuming classical terms.
(For more see "Roots and
Fruits of
Decoherence"
- in particular Sects. 3, 5 and
6.)
Quantum nonlocality vs. Einstein locality
H. D. Zeh
Quantum theory is
kinematically nonlocal, while the theory
of
relativity
(including relativistic quantum field theory) requires
dynamical
locality ("Einstein locality"). How can these two
elements of the
theory (well
based
on experimental results) be
simultaneously meaningful and compatible?
How
can dynamical locality
even be defined
in terms of
kinematically
nonlocal
concepts?
Dynamical locality in
conventional terms means that there is no action at a distance: states
"here"
cannot
directly
influence states "there". Relativistically this has the
consequence that dynamical
effects
can only arise within the forward light cones of their
causes. However,
generic quantum states are "neither here nor there",
nor are they
simply composed of "states here
and states there"
(with a logical "and" that would in the
quantum formalism
be
represented as a direct product). Quantum systems at
different
places are usually entangled, and thus do
not possess any
states of their own. Therefore, quantum dynamics must
in
general
describe the dynamics of global
states. It may thus appear to be necessarily
nonlocal.
This discrepancy is often muddled by insisting
that
reality is made up of
local
events or phenomena only. However,
quantum entanglement does
not
merely represent
statistical
correlations that would reflect
incomplete information about a
local reality. Individually
observable
quantities,
such as the total angular momentum of composite systems, or the binding energy of the He atom, cannot be defined in terms of local
quantities. This
nonlocality has been directly
confirmed by the violation of
Bell's
inequalities or the existence
of
Greenberger-Horne-Zeilinger
relations. If there were
kinematically local concepts
completely describing reality, they
would indeed require some
superluminal "spooky action at a distance"
(in
Einstein's words).
Otherwise, however, such a picture may become
meaningless, and
nothing is teleported in
so-called quantum
teleportation
experiments, for example. Instead, one has to
carefully prepare
an
appropriate entangled
state that contains, among its components,
all states to be
possibly
teleported (or
their dynamical predecessors) already at
their final destination –
similar to the hedgehog's wife
in
the
Grimm brothers' story of Der Hase und der Igel (see Quantum teleportation and other quantum
misnomers).
These kinematical aspects characterize
quantum nonlocality. But what
about Einstein
locality in this
picture? Why does the change of a global quantum state
not allow
superluminal signals, for example? The concept of locality in
quantum theory requires
more than a formal
Hilbert space structure (relativistically as well
as
non-relativistically). It presumes a local Hilbert space basis (for example consisting
of
spatial
fields and/or
particles). Dynamical locality then means
that the Hamiltonian is a sum
over local terms, or
an
integral
over a
local Hamiltonian density in space, while all
dynamical
propagators for these local elements
must relativistically
obey the light cone structure.
This framework is most
successfully represented by quantum
field theory.
It may be
characterized by the following program:
(1) Define an underlying
set of local "classical" fields (including a
spatial metric)
on a
three-dimensional (or more general) manifold.
(2) Define quantum
states as wave functionals of these fields
(that is, nonlocal
superpositions of different spatial fields).
(3) Assume that the
Hamiltonian operator H (acting on wave functionals)
is defined as an
integral over a Hamiltonian density, written in terms
of these fields
at each space point.
(4) Using this Hamiltonian, write down a
time-dependent
Schrödinger equation for the wave functionals,
or, in order
to allow the inclusion of quantum gravity, a
Wheeler-DeWitt equation:
HΨ = 0.
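In schematic form (a sketch only, suppressing field indices, the precise representation of the momenta, and the gauge complications mentioned below), steps (3) and (4) read
\[ H = \int d^3x\; \mathcal{H}\Bigl(\phi(\mathbf{x}),\, -i\hbar\,\tfrac{\delta}{\delta \phi(\mathbf{x})}\Bigr) , \qquad i\hbar\,\frac{\partial \Psi[\phi,t]}{\partial t} = H\, \Psi[\phi,t] \quad\text{or}\quad H\,\Psi = 0 . \]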
The
dynamics is then local (in the classical sense) for all
local
components, which, according to this construction, must span every nonlocal
state. This
concept defines the quantum version of Einstein locality.
(I have
here not discussed complications resulting from nonlocal
gauge
degrees
of freedom.)
The local (additive) form of the
Hamiltonian has an important
dynamical
consequence for nonlocal
states. If two distant systems A and B are entangled, Ψ = Σ_n c_n φ_n(A) χ_n(B) in the Schmidt decomposition, all matrix elements of H between components with different n must vanish, since the individual, local terms of H can only act on A or on B. Such
"dislocalized
superpositions" arise
unavoidably by
means of decoherence, while their relocalization
("recoherence") would
require an improbable accident in a causal
universe (see The
Physical
Basis
of the Direction of Time). The factorizing Schmidt
components
thus describe dynamically autonomous "worlds", which must
contain
separate observers, and which permanently branch
by means of
measurement-like processes.
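To see the vanishing of these matrix elements explicitly (in the notation used above, and assuming that the two distant systems do not interact directly, so that H acts on them additively as H_A + H_B):
\[ \langle \phi_m \chi_m |\, (H_A + H_B)\, | \phi_n \chi_n \rangle = \langle \phi_m | H_A | \phi_n \rangle \langle \chi_m | \chi_n \rangle + \langle \phi_m | \phi_n \rangle \langle \chi_m | H_B | \chi_n \rangle = 0 \quad (m \neq n) , \]
since the Schmidt states of each subsystem are mutually orthogonal.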
This
dynamical argument, based on
nothing else but the Schrödinger
equation with its local
Hamiltonian,
justifies Everett's collapse-free
interpretation of
quantum theory. (Note that the linearity of dynamics
by itself would
not be
sufficient for this
purpose, since it would not correctly describe
quantum measurements
and related phenomena.)
If, in the case of a Wheeler-DeWitt
equation, a WKB
approximation
(based on a
Born-Oppenheimer expansion
in terms of the
Planck mass) applies, orbit-like "wave tubes" in the
"superspace"
of
spatial geometries (the configuration space of general
relativity)
may
define quasi-classical spacetimes (such as solutions
of the Einstein
equations). The corresponding matter states obey a
derived
time-dependent
Schrödinger
equation with
respect to a
"WKB time" parameter along these quasi-classical orbits of
spatial
geometries (see
C.
Kiefer: Quantum Gravity, Cambridge UP, 2007). Wave
tubes on the
configuration space of geometry
are decohered from
one
another by the matter states (which are thereby regarded as an
environment
to
quantum geometry) according to the
Wheeler-DeWitt equation. This
decoherence along quasi-trajectories in
superspace may lead
to further quasi-classical fields, and possibly
other quasi-local
variables,
which are robust in
the sense that their
different values define dynamically autonomous
components
("branches"). Einstein locality then
holds up to remaining quantum
uncertainties of the spacetime metric
(resulting from the
non-vanishing widths of the wave packets in
superspace).
In "effective" (phenomenological) quantum field
theories, dynamical
locality
is often formulated
by means of a
condition of microcausality.
It requires that commutators between
field operators
at spacelike separated spacetime points vanish. This
condition is
partially
kinematic
(as it presumes a local reference
basis of quantum states), partially
dynamical (as it
uses the
Heisenberg picture for field operators), and partially
a matter of
definition (as it requires a decomposition of the field
operators in
terms of "particles and antiparticles", which may depend
on the effective vacuum, for
example).
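For a bosonic field, for example, this condition takes the familiar form
\[ [\,\phi(x), \phi(y)\,] = 0 \quad \text{for spacelike separated } x \text{ and } y , \]
with the commutator replaced by an anticommutator for fermionic field operators.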
The dynamical
consistency of this microcausality condition is highly
nontrivial. In
principle,
the properties of (anti-)commutators of
(effective) field operators at
different times
should be derivable from
those on an
arbitrary simultaneity, t = t', by
means of the given relativistic
dynamics (Hamiltonian). They cannot be
independently postulated for
all
times.
In his foundation of quantum field theory,
Steven Weinberg derived
microcausality and the locality of the
Hamiltonian from his cluster
decomposition principle.
This is a phenomenological constraint on the S-matrix, which
requires
that "distant experiments give uncorrelated results".
However, such a
principle cannot form a fundamental element of the
quantum theory,
since (a) observable correlations may
exist or controllably be
prepared either as statistical
correlations or as entanglement
between distant systems,
and (b) the
concept of an S-matrix is
(approximately) applicable only to
sufficiently isolated
(microscopic) systems. Macroscopic systems never
cease to interact
uncontrollably with their environment – thus giving
rise
to
decoherence,
and hence to their classical behavior or the
appearance of "quantum
events" (see How
decoherence may solve the
measurement problem). Only the latter
justify the probability
interpretation of the S-matrix – even
for microscopic objects. So I
feel that instead of going beyond the
empirically founded effective
theories when searching for
mathematical consistency of hypothetical
theories
(in the hope of finding the final universal theory), physicists should
first analyze the physical
consistency and meaning
of effective field theories (see
also Chap. 6 of The Physical Basis
of the
Direction
of Time).
The original version of these essays can be found on the author's homepage