In an earlier post we introduced the Schrödinger picture of quantum mechanics, which can be summarized as follows: the state of a quantum system is described by a unit vector $\psi$ in some Hilbert space $\mathcal{H}$ (up to multiplication by a constant), and time evolution is given by

$$\psi(t) = e^{-\frac{i}{\hbar} Ht} \psi(0)$$

where $H$ is a self-adjoint operator on $\mathcal{H}$ called the Hamiltonian. Observables are given by other self-adjoint operators $A$, and at least in the case when $A$ has discrete spectrum measurement can be described as follows: if $\psi_\lambda$ is a unit eigenvector of $A$ with eigenvalue $\lambda$, then $A$ takes the value $\lambda$ upon measurement with probability $|\langle \psi, \psi_\lambda \rangle|^2$; moreover, the state vector $\psi$ is projected onto $\psi_\lambda$.
The Heisenberg picture is an alternate way of understanding time evolution which de-emphasizes the role of the state vector. Instead of transforming the state vector, we transform observables, and this point of view allows us to talk about time evolution (independent of measurement) without mentioning state vectors at all: we can work entirely with the algebra of bounded operators. This point of view is attractive because, among other things, once we isolate what properties we need this algebra to have we can abstract them to a more general setting such as that of von Neumann algebras.
In order to get a feel for the kind of observables people actually care about, we won’t study a finite toy model in this post: instead we’ll work through some classical (!) one-dimensional examples.
The Heisenberg picture
From our above description of measurement of an observable $A$ it follows that the expected value of $A$ can be given by the elegant formula

$$\mathbb{E}_{\psi}(A) = \langle \psi, A \psi \rangle.$$

Since the characteristic function of a random variable completely determines it, it is possible in principle to replace knowledge of the state vector $\psi$ with knowledge of the expectation $\mathbb{E}_{\psi}$ it induces on observables. In the theory of von Neumann algebras, $\mathbb{E}_{\psi}$ is referred to as a state for this reason.
Now, as $\psi$ evolves according to the Schrödinger equation, the expectation of an observable $A$ evolves as

$$\mathbb{E}_{\psi(t)}(A) = \langle e^{-\frac{i}{\hbar} Ht} \psi, A \, e^{-\frac{i}{\hbar} Ht} \psi \rangle = \langle \psi, e^{\frac{i}{\hbar} Ht} A \, e^{-\frac{i}{\hbar} Ht} \psi \rangle.$$

In the Schrödinger picture, we keep our algebra of observables invariant and modify the state $\psi$ (or equivalently the expectation $\mathbb{E}_{\psi}$), but in the Heisenberg picture, we keep the expectation $\mathbb{E}_{\psi}$ invariant and modify the algebra of observables by the one-parameter group of inner automorphisms

$$A \mapsto A(t) = e^{\frac{i}{\hbar} Ht} A \, e^{-\frac{i}{\hbar} Ht}.$$
This is quite an elegant way to think about time evolution: it tells us that we can delay thinking about states until we actually want to compute probabilities. At any point before then, we can think directly about how observables are changing, and consequently we don’t need to mention states at all to talk about properties of observables which don’t depend on initial conditions.
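Although the examples below are infinite-dimensional, this bookkeeping is easy to sanity-check numerically in a finite-dimensional toy model. Here is a minimal Python sketch (the random matrices, the value of $t$, and all names are arbitrary choices of mine) confirming that the two pictures assign the same expectation value:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n, t, hbar = 4, 1.3, 1.0

def random_hermitian(rng, n):
    M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (M + M.conj().T) / 2

H = random_hermitian(rng, n)                    # toy Hamiltonian
A = random_hermitian(rng, n)                    # toy observable
psi = rng.normal(size=n) + 1j * rng.normal(size=n)
psi /= np.linalg.norm(psi)                      # unit state vector

U = expm(-1j * t * H / hbar)                    # e^{-iHt/hbar}

schrodinger = np.vdot(U @ psi, A @ (U @ psi)).real        # <psi(t), A psi(t)>
heisenberg = np.vdot(psi, U.conj().T @ A @ U @ psi).real  # <psi, A(t) psi>
assert abs(schrodinger - heisenberg) < 1e-10
```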
Now that we've conceived of time evolution as a one-parameter group of inner automorphisms of the algebra of observables, we can take its derivative, which is precisely the inner derivation

$$D(A) = \frac{d}{dt} A(t) \Big|_{t=0} = \frac{i}{\hbar} [H, A]$$

by a simple calculation. Recalling that, for a one-parameter group of automorphisms with associated derivation $D$, we have at least formally the Taylor expansion

$$A(t) = \sum_{n \ge 0} \frac{t^n}{n!} D^n(A),$$

it follows that

$$A(t) = e^{\frac{i}{\hbar} Ht} A \, e^{-\frac{i}{\hbar} Ht} = \sum_{n \ge 0} \frac{t^n}{n!} \left( \frac{i}{\hbar} \right)^n [H, [H, \cdots [H, A] \cdots ]].$$
In particular, time evolution is completely determined by the commutator of the Hamiltonian with any observable. Note that an observable $A$ is invariant under time evolution if and only if it commutes with the Hamiltonian, and consequently note that the infinitesimal generator of any one-parameter group of symmetries of the Hamiltonian gives a self-adjoint operator by Stone's theorem, hence gives a conserved observable. This is how Noether's theorem appears in quantum mechanics.
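The formal Taylor expansion above can be checked the same way: here is a short continuation of the toy model (again my own names and constants), showing that the partial sums of $\sum_{n \ge 0} \frac{t^n}{n!} D^n(A)$ with $D(A) = \frac{i}{\hbar} [H, A]$ converge to $e^{\frac{i}{\hbar} Ht} A \, e^{-\frac{i}{\hbar} Ht}$:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
n, t, hbar = 4, 0.7, 1.0

def random_hermitian(rng, n):
    M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (M + M.conj().T) / 2

H, A = random_hermitian(rng, n), random_hermitian(rng, n)

def D(X):
    """The inner derivation X -> (i/hbar) [H, X]."""
    return (1j / hbar) * (H @ X - X @ H)

series, term = np.zeros_like(A), A.copy()
for k in range(60):                   # partial sums of sum_n t^n D^n(A) / n!
    series += term
    term = t * D(term) / (k + 1)

U = expm(-1j * t * H / hbar)
A_t = U.conj().T @ A @ U              # A(t) = e^{iHt/hbar} A e^{-iHt/hbar}
assert np.allclose(series, A_t)
```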
Particle on a line
The most basic example is the free particle on $\mathbb{R}$, with Hilbert space $L^2(\mathbb{R})$. Here the Hamiltonian is

$$H = \frac{P^2}{2m}$$

where $m$ is mass (a constant) and $P = -i \hbar \frac{d}{dx}$ is momentum (an observable), which is, up to normalization, the infinitesimal generator of translation. (Note that translation clearly acts by a one-parameter group of unitary transformations on $L^2(\mathbb{R})$, so Stone's theorem assures us we have a self-adjoint operator here.) This gives

$$H = -\frac{\hbar^2}{2m} \frac{d^2}{dx^2}$$

which is, up to normalization, the ordinary Laplacian. Formally, the eigenvectors of $H$ with non-negative eigenvalues are

$$\psi_p(x) = e^{\frac{i}{\hbar} px}, \quad p \in \mathbb{R},$$

with corresponding eigenvalues

$$E_p = \frac{p^2}{2m}.$$
We have chosen these eigenvectors because they are also eigenvectors of the momentum operator with eigenvalues $p$, which recovers the de Broglie relation $p = \hbar k$. (The other de Broglie relation, $E = \hbar \omega$, is already built into the Schrödinger equation, since an eigenvector for the Hamiltonian with eigenvalue $E$ is multiplied by $e^{-\frac{i}{\hbar} Et}$ in time evolution.) Note that these "eigenvectors" do not actually exist in $L^2(\mathbb{R})$. There is a formalism for dealing with this, but I don't know it; in any case it can be dealt with.
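At the formal level, though, the eigenvector claims are easy to verify symbolically. Here is a minimal sympy sketch (my own notation) checking that the plane wave is an eigenfunction of both the momentum operator and the free Hamiltonian with the stated eigenvalues:

```python
import sympy as sp

x, p = sp.symbols('x p', real=True)
m, hbar = sp.symbols('m hbar', positive=True)

psi = sp.exp(sp.I * p * x / hbar)                      # plane wave e^{ipx/hbar}

P_psi = -sp.I * hbar * sp.diff(psi, x)                 # momentum operator applied to psi
H_psi = -(hbar**2 / (2 * m)) * sp.diff(psi, x, 2)      # free Hamiltonian applied to psi

assert sp.simplify(P_psi - p * psi) == 0               # P psi = p psi
assert sp.simplify(H_psi - p**2 / (2 * m) * psi) == 0  # H psi = (p^2 / 2m) psi
```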
Time evolution for a single eigenvector is given by

$$\psi_p(x, t) = e^{\frac{i}{\hbar} (px - E_p t)}.$$

Of course multiplying the state vector by a nonzero complex number doesn't affect anything we can measure about the state. It's only when two or more eigenvectors are added together that the multiplication above manifests itself as (what appears to be) interference between waves. In fact, the above describes precisely a plane wave with angular frequency $\omega = \frac{E_p}{\hbar}$ and wavenumber $k = \frac{p}{\hbar}$.
Since $H$ is a scalar multiple of the square of $P$, it follows that $[H, P] = 0$, so we recover conservation of momentum. The operator $Q$ (more precisely, multiplication by $x$), whose eigenvalues describe the position of our free particle, should not commute with either $H$ or $P$ since it should not be conserved. Instead, we have the canonical commutation relation

$$[Q, P] = i \hbar$$

which gives (using the fact that $[Q, -]$ is a derivation)

$$[Q, H] = \frac{1}{2m} [Q, P^2] = \frac{i \hbar}{m} P.$$

This gives

$$\frac{dQ}{dt} = \frac{i}{\hbar} [H, Q] = \frac{P}{m}$$

so we recover the familiar fact that $\frac{P}{m}$ is the velocity of a free particle.
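These commutator computations can also be verified by realizing the operators concretely on a test function. Here is a sympy sketch (my own, with $Q$ acting as multiplication by $x$ and $P$ as $-i\hbar \frac{d}{dx}$):

```python
import sympy as sp

x = sp.Symbol('x', real=True)
m, hbar = sp.symbols('m hbar', positive=True)
f = sp.Function('f')(x)                            # generic test function

P = lambda g: -sp.I * hbar * sp.diff(g, x)         # momentum operator
Q = lambda g: x * g                                # position operator (multiplication by x)
H = lambda g: P(P(g)) / (2 * m)                    # free Hamiltonian P^2 / 2m

# canonical commutation relation: [Q, P] f = i hbar f
assert sp.simplify(Q(P(f)) - P(Q(f)) - sp.I * hbar * f) == 0

# [Q, H] f = (i hbar / m) P f, hence dQ/dt = (i/hbar) [H, Q] = P/m
assert sp.simplify(Q(H(f)) - H(Q(f)) - (sp.I * hbar / m) * P(f)) == 0
```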
Particle in a box
In order to work with some eigenvectors which actually exist in $L^2$ (hence which give rise to well-defined probability distributions), let's restrict our formerly free particle to a box $[0, L]$. This is equivalent to requiring that the state vector vanish outside of this interval. If we further require that the state vector is continuous, then it must vanish at $0$ and $L$. The eigenspace of the Hamiltonian with eigenvalue $\frac{p^2}{2m}$ is spanned by $e^{\frac{i}{\hbar} px}$ and $e^{-\frac{i}{\hbar} px}$, and in each of these eigenspaces we can find normalized eigenvectors

$$\psi_n(x) = \sqrt{\frac{2}{L}} \sin \left( \frac{n \pi x}{L} \right)$$

vanishing at $0$ and $L$ whenever $p = \frac{n \pi \hbar}{L}$ for a positive integer $n$. This is a complete list of eigenvectors of the Laplacian vanishing at $0$ and $L$, and time evolution is given by

$$\psi_n(x, t) = \sqrt{\frac{2}{L}} \sin \left( \frac{n \pi x}{L} \right) e^{-\frac{i}{\hbar} E_n t}$$

where

$$E_n = \frac{n^2 \pi^2 \hbar^2}{2 m L^2}.$$
Now that we've restricted ourselves to a compact space, we finally get a system in which the energy eigenvalues are discrete, or quantized (from which quantum mechanics gets its name). Intuitively speaking, a particle in a box behaves like a wave, constantly bouncing against the walls; if it oscillated at the wrong frequency, destructive interference would eventually cause it to disappear. Only certain frequencies, dictated by the shape of the box, survive.
Note that the lowest energy eigenvalue is not zero; it occurs when $n = 1$. So unlike the classical case, a quantum particle in a box cannot have zero energy.
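As a numerical aside, these energy levels can be seen emerging from a crude discretization of the Hamiltonian. A minimal sketch (grid size and units are arbitrary choices of mine):

```python
import numpy as np

hbar = m = L = 1.0
N = 500                                            # number of interior grid points
dx = L / (N + 1)

# second-difference approximation to d^2/dx^2 with Dirichlet conditions at 0 and L
laplacian = (np.diag(np.full(N, -2.0)) +
             np.diag(np.ones(N - 1), 1) +
             np.diag(np.ones(N - 1), -1)) / dx**2
H = -(hbar**2 / (2 * m)) * laplacian               # discretized Hamiltonian

numeric = np.linalg.eigvalsh(H)[:4]
exact = np.array([(n * np.pi * hbar)**2 / (2 * m * L**2) for n in range(1, 5)])
print(numeric)                                     # approximately [4.93, 19.74, 44.41, 78.95]
print(exact)                                       # [pi^2/2, 2 pi^2, 9 pi^2/2, 8 pi^2]
```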
Particle in a potential
Next we consider the case when a particle on $\mathbb{R}$ is subject to some potential. Potentials modify momentum, but the relationship between momentum and velocity should remain intact, so we still want

$$\frac{dQ}{dt} = \frac{P}{m}$$

hence we still want $[Q, H] = \frac{i \hbar}{m} P$. We can guarantee this whenever $H$ has the form

$$H = \frac{P^2}{2m} + V(Q)$$

where $V(Q)$ is a multiplication operator, since $[Q, V(Q)] = 0$, and in fact $V$ is the desired potential. Potentials, being not translation-invariant in general, break conservation of momentum, and we instead have

$$\frac{dP}{dt} = \frac{i}{\hbar} [H, P] = \frac{i}{\hbar} [V(Q), P].$$

Now, let us suppose that $V$ is a nice analytic function of $Q$. By induction on the relation $[Q, P] = i \hbar$, using the fact that $[-, P]$ is a derivation, we conclude that

$$[Q^n, P] = i \hbar \, n Q^{n-1},$$

hence (at least formally)

$$[V(Q), P] = i \hbar \, V'(Q).$$
From this it follows that

$$\frac{dP}{dt} = -V'(Q)$$

which recovers Newton's second law $F = ma$, remembering that the force associated to a potential is $F = -V'(Q)$ and that $P = m \frac{dQ}{dt}$ classically.
We can deduce all this despite the fact that for arbitrary $V$ it is not at all obvious how to directly write down the eigenvectors or eigenvalues of the corresponding Hamiltonian.
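The key formal identity $[V(Q), P] = i \hbar V'(Q)$ can, however, be checked directly by letting both sides act on a test function. A minimal sympy sketch (my own notation):

```python
import sympy as sp

x = sp.Symbol('x', real=True)
hbar = sp.Symbol('hbar', positive=True)
f = sp.Function('f')(x)                            # generic test function
V = sp.Function('V')(x)                            # generic potential

P = lambda g: -sp.I * hbar * sp.diff(g, x)         # momentum operator

lhs = V * P(f) - P(V * f)                          # [V(Q), P] applied to f
rhs = sp.I * hbar * sp.diff(V, x) * f              # i hbar V'(Q) applied to f
assert sp.simplify(lhs - rhs) == 0
```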
A particularly simple case occurs when $V(Q) = cQ$ for some constant $c$. Then we compute that

$$\frac{dP}{dt} = -c,$$

hence that

$$P(t) = P(0) - ct, \qquad Q(t) = Q(0) + \frac{P(0)}{m} t - \frac{c}{2m} t^2$$

exactly as in the classical case. I do not know what the eigenvectors of the Hamiltonian look like in this case.
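One way to see where these formulas come from is the Taylor expansion $A(t) = \sum_{n \ge 0} \frac{t^n}{n!} D^n(A)$ from earlier: for $V(Q) = cQ$ the iterated derivation terminates after two steps, so the series for $Q(t)$ is just a quadratic in $t$. A sympy sketch (my own) checking this on a test function:

```python
import sympy as sp

x = sp.Symbol('x', real=True)
c, m, hbar = sp.symbols('c m hbar', positive=True)
f = sp.Function('f')(x)                            # generic test function

P = lambda g: -sp.I * hbar * sp.diff(g, x)         # momentum operator
Q = lambda g: x * g                                # position operator
H = lambda g: P(P(g)) / (2 * m) + c * x * g        # H = P^2/2m + cQ

def D(A):
    """The derivation A -> (i/hbar)[H, A], acting on operators given as callables."""
    return lambda g: (sp.I / hbar) * (H(A(g)) - A(H(g)))

DQ, D2Q, D3Q = D(Q), D(D(Q)), D(D(D(Q)))

assert sp.simplify(DQ(f) - P(f) / m) == 0          # D(Q) = P/m
assert sp.simplify(D2Q(f) + (c / m) * f) == 0      # D^2(Q) = -c/m
assert sp.simplify(D3Q(f)) == 0                    # D^3(Q) = 0, so Q(t) is quadratic in t
```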
Another simple case of a particle in a potential is the quantum harmonic oscillator, where $V(Q) = \frac{k Q^2}{2}$ for some constant $k$, so that

$$H = \frac{P^2}{2m} + \frac{k Q^2}{2}.$$

This gives

$$\frac{dP}{dt} = -k Q$$

which, together with the relation $\frac{dQ}{dt} = \frac{P}{m}$, implies that the iterated commutators of either $Q$ or $P$ with $H$ are (up to normalizing factors) periodic, alternating between some constant times $Q$ and some constant times $P$. In fact, we have

$$Q(t) = Q(0) \cos (\omega t) + \frac{P(0)}{m \omega} \sin (\omega t), \qquad P(t) = P(0) \cos (\omega t) - m \omega Q(0) \sin (\omega t)$$

where $\omega = \sqrt{\frac{k}{m}}$, again exactly as in the classical case of, for example, a spring-mass system. Note that in this special case, conservation of energy is equivalent to the Pythagorean theorem!
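These closed-form solutions can be checked directly against the Heisenberg equations of motion. Here is a sympy sketch (my own, treating $Q(0)$ and $P(0)$ as noncommuting symbols) verifying that they satisfy $\frac{dQ}{dt} = \frac{P}{m}$ and $\frac{dP}{dt} = -kQ$ and that they conserve the energy:

```python
import sympy as sp

t = sp.Symbol('t', real=True)
m, k = sp.symbols('m k', positive=True)
omega = sp.sqrt(k / m)
Q0, P0 = sp.symbols('Q0 P0', commutative=False)    # the initial observables Q(0), P(0)

Q = Q0 * sp.cos(omega * t) + (P0 / (m * omega)) * sp.sin(omega * t)
P = P0 * sp.cos(omega * t) - m * omega * Q0 * sp.sin(omega * t)

assert sp.simplify(sp.diff(Q, t) - P / m) == 0     # dQ/dt = P/m
assert sp.simplify(sp.diff(P, t) + k * Q) == 0     # dP/dt = -kQ

H = P * P / (2 * m) + k * Q * Q / 2                # harmonic oscillator Hamiltonian
assert sp.simplify(sp.diff(sp.expand(H), t)) == 0  # conservation of energy
```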
The quantum harmonic oscillator is important in quantum field theory for reasons I don't understand yet. Part of its basic importance to quantum mechanics can be understood as follows: for an arbitrary potential, expanding out $V$ in the neighborhood of a critical point, the linear term vanishes and so generically we get a quadratic term (plus lower order terms), hence to second order a harmonic oscillator.
There is a very elegant way to write down the eigenvectors of the Hamiltonian in this case using Dirac’s ladder method, but I think it would be best to leave such considerations until I understand the harmonic oscillator better.
Generalizations
One can take tensor products of any of the examples above to get examples in higher dimensions; for example, one can consider particles in boxes in $\mathbb{R}^n$ for any $n$. More generally one can consider particles on any Riemannian manifold, where the kinetic term of the Hamiltonian is taken to be a suitable multiple of the Laplacian. Particularly interesting cases include manifolds with large symmetry groups, since the eigenspaces of the Laplacian break up into irreducible representations of these groups.
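For instance, for a square box in the plane the Hilbert space is a tensor product of two copies of the one-dimensional case, so the energy levels are just pairwise sums of the one-dimensional levels. A tiny sketch (units arbitrary):

```python
import numpy as np

hbar = m = L = 1.0
n = np.arange(1, 6)
E1 = (n * np.pi * hbar)**2 / (2 * m * L**2)        # 1D particle-in-a-box levels E_n
E2 = np.add.outer(E1, E1)                          # 2D square-box levels E_{n1} + E_{n2}
print(np.sort(E2.ravel())[:6])                     # lowest levels; note the degeneracy E_{1,2} = E_{2,1}
```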
Regarding the eigenvectors of the homogeneous potential: this can be solved easily by working in the momentum representation, since there we only need to solve a first-order ODE. One obtains the Airy functions in this way. (Of course, one can also solve the position-space second-order ODE by the usual methods, e.g. by assuming the functions can be expanded into series.)
As for the importance of the harmonic oscillator: even from classical theory we know that free fields (e.g. an EM field) are most naturally treated in the form of monochromatic plane waves (i.e. using Fourier analysis). Since the fields are free, the individual plane waves evolve according to a harmonic oscillator Hamiltonian. The Hamiltonian of such a free field \phi(x) can be written as H = 1/2 \int dx [\pi(x)^2 + \phi(x) \Omega \phi(x)]. This can be regarded as a continuous version of the harmonic oscillator where the frequencies of the modes are determined by the eigenvalues of \Omega.
Also, since you're mentioning pictures, the one that is most often used should also be mentioned: the Dirac (or interaction) picture. While what you write is nice, in real life one rarely has a time-independent Hamiltonian or is able to solve the eigenvector equation completely. The most natural description of many problems is by splitting the Hamiltonian H into H_0 and H_I such that H_0 is in some sense understood (e.g. we know the spectrum) while H_I is in some sense small w.r.t. H_0. This then describes an interaction. Then one can propose a picture that evolves operators according to H_0 and states according to H_I (this is simplified for the case when H_I is independent of time). Or in other words, we discard the H_0 evolution that doesn't really interest us. This can again be seen to be equivalent to both the S- and H-pictures.
Thanks for this comment. It seems I should learn more about classical field theory. And yes, I was planning on getting to the Dirac picture eventually.
Note that, generally, the Schrodinger picture is _more information_ than the Heisenberg picture, because it does include a particular (sometimes only projective) representation.
Actually, there is an interesting story to tell here, and I'm currently trying to develop it in some work in progress, so eventually I will be able to say more. The analogy is with whether in quantum mechanics there is the Hilbert space, or just its projectivization, or just …? For example, in the path-integral approach to QM/QFT, there is a partition function with absolute magnitude, but then the only "physical" things are expectation values, in which the partition function is normalized to unity. (Note that in statistical mechanics, which is closely analogous to pure-imaginary-time quantum mechanics, the logarithmic derivative with respect to temperature of the partition function is physical; in the analogy, temperature equals one over Planck's constant.)
Anyway, what I really wanted to mention was the Schrodinger/Heisenberg level of the analogy. If you tell the Schrodinger picture to a mathematician, she's likely to think that you're telling her a vector space, which is to say an object V in the category Vect (or a category of topological vector spaces). Now, this is a misinterpretation, but only in the details. Then she's likely to say, well, if I didn't know the monoidal structure on Vect, then all I really have is a category with an object V. If some conditions are satisfied (they aren't, but let's pretend they are), then this data is _the same as_ an algebra A = End(V): the functor Hom(V,-) goes from the category to A-mod, and if the conditions are satisfied it's an equivalence.
But of course we do know about the monoidal structure, or at least we do have a distinguished object, the ground field C. So on the one hand, we have Vect = C-mod and its distinguished object V: this is the Schrodinger picture. On the other hand, from just the category and V, we can build A-mod, where A = End_C(V). But now look at where C goes under the functor C-mod -> A-mod: it goes to (the dual to) V. So actually A-mod also has a distinguished object, but this time it's the image of C, except that it's still V, now thought of as an A-module. So to get back from the Heisenberg picture to the Schrodinger picture is just to know the distinguished A-module V (but there's something interesting in how the data rearranges).
As I said, this is related to work-in-progress. You probably know about Atiyah-Segal style TQFT. I would like to suggest that this is like the Schrodinger picture: ultimately, a (fully extended) n-dimensional TQFT is the same data as an (n-1)-categorical thing V, which is an object in _the_ n-category that's next on the list starting C, Vect, …. From this data, you can forget that you are working in _the_ n-category, and just remember that you are working in _an_ n-category, and then take End(V), which is now something like an E_n algebra. On the other hand, an E_n algebra is the same data as a "topological factorization algebra", also (mis)called a "topological chiral algebra", and discussed by folks including Beilinson-Drinfeld, Lurie, Francis, and Costello-Gwilliam. So the idea is that factorization algebra is the "Heisenberg" picture, and the E_n algebras that come from TQFT are precisely the ones that are "E_n Morita equivalent" to C-as-an-E_n-algebra. When n=1, this is the Heisenberg-versus-Schrodinger relationship above, where "Morita Equivalence" is clearly the correct term. When n=0, an E_0 algebra is a pointed vector space, "Morita equivalent" means "isomorphic as vector spaces after forgetting the chosen point", and the relationship is the one between knowing the partition function or just knowing expectation values.
Of course, just as the Heisenberg picture is in many ways more “physical” than the Schrodinger picture in QM, I claim that factorization algebras are in many ways closer to real physics than are Atiyah-Segal TQFTs.
Qiaochu,
Given your interest in QM, you might want to watch (assuming you haven’t done so already) Prof. Sidney Coleman’s famous 1994 lecture (approx. 1 hour long):
Quantum Mechanics in your face
Thanks! I’ll check this out soon.