Finite noncommutative probability, the Born rule, and wave function collapse

The previous post on noncommutative probability was too long to leave much room for examples of random algebras. In this post we will describe all finite-dimensional random algebras with faithful states and all states on them. This will lead, in particular, to a derivation of the Born rule from statistical mechanics. We will then give a mathematical description of wave function collapse as taking a conditional expectation.
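
To make the collapse-as-conditional-expectation idea concrete, here is a minimal NumPy sketch (the observable, state, and dimensions are hypothetical toy choices, not from the post): collapsing a density matrix $\rho$ with respect to an observable $F$ means replacing $\rho$ by $\sum_k P_k \rho P_k$, where the $P_k$ are the spectral projections of $F$. This kills the off-diagonal coherences between eigenspaces while preserving the Born-rule probabilities on the diagonal.

```python
import numpy as np

# Hypothetical 2x2 example: an observable F and a pure-state density matrix rho.
F = np.array([[1.0, 0.0], [0.0, -1.0]])   # eigenvalues +1, -1
psi = np.array([3.0, 4.0]) / 5.0          # a unit state vector
rho = np.outer(psi, psi.conj())           # pure-state density matrix

# Spectral projections of F onto its eigenspaces.
eigvals, eigvecs = np.linalg.eigh(F)
projs = [np.outer(v, v.conj()) for v in eigvecs.T]

# "Collapse" as a conditional expectation: rho |-> sum_k P_k rho P_k,
# which kills the off-diagonal (coherence) terms between eigenspaces.
collapsed = sum(P @ rho @ P for P in projs)

print(np.diag(collapsed))  # the Born-rule probabilities survive on the diagonal
```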

Noncommutative probability

The traditional mathematical axiomatization of probability, due to Kolmogorov, begins with a probability space $P$ and constructs random variables as certain functions $P \to \mathbb{R}$. But start doing any probability and it becomes clear that the space $P$ is de-emphasized as much as possible; the real focus of probability theory is on the algebra of random variables. It would be nice to have an approach to probability theory that reflects this.

Moreover, in the traditional approach, random variables necessarily commute. However, in quantum mechanics, the random variables are self-adjoint operators on a Hilbert space $H$, and these do not commute in general. For the purposes of doing quantum probability, it is therefore also natural to look for an approach to probability theory that begins with an algebra, not necessarily commutative, which encompasses both the classical and quantum cases.
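
As a quick sanity check on how the classical case sits inside the quantum one, here is a small sketch (the distribution and observable are made-up toy data): a state on an algebra is a linear functional $E$ with $E(1) = 1$ and $E(a^* a) \ge 0$, and embedding classical random variables as diagonal matrices recovers the ordinary expectation as a trace against a density matrix.

```python
import numpy as np

# Classical case: the algebra is functions on a finite set, and a state is
# expectation against a probability distribution.
p = np.array([0.2, 0.5, 0.3])     # a probability distribution (toy data)
f = np.array([1.0, 2.0, 4.0])     # a random variable on a 3-point space
E_classical = np.dot(p, f)

# Quantum case: the algebra is n x n matrices and E(a) = tr(rho a) for a
# density matrix rho. Embedding the classical data diagonally gives the
# same number, so the commutative theory sits inside the noncommutative one.
rho = np.diag(p)
a = np.diag(f)
E_quantum = np.trace(rho @ a)

assert np.isclose(E_classical, E_quantum)
```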

Happily, noncommutative probability provides such an approach. Terence Tao’s notes on free probability develop a version of noncommutative probability geared towards applications to random matrices, but today I would like to take a more leisurely and somewhat scattered route, aimed at getting a general feel for what this formalism is capable of talking about.

Banach algebras, the Gelfand representation, and the commutative Gelfand-Naimark theorem

Banach algebras abstract the properties of closed algebras of operators on Banach spaces. Many basic properties of such operators have elegant proofs in the framework of Banach algebras, and Banach algebras also naturally appear in areas of mathematics like harmonic analysis, where one writes down Banach algebras generalizing the group algebra to study topological groups.

Today we will develop some of the basic theory of Banach algebras, our goal being to discuss the Gelfand representation of a commutative Banach algebra and the fact that, for commutative C*-algebras, this representation is an isometric isomorphism. This implies in particular a spectral theorem for self-adjoint operators on a Hilbert space.
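
In finite dimensions the spectral theorem is something one can check by hand; the following sketch (with a hypothetical $2 \times 2$ matrix) reassembles a self-adjoint matrix from its spectral decomposition $A = \sum_k \lambda_k P_k$, and uses the same decomposition to apply a function to the spectrum, which is the functional calculus the Gelfand representation packages.

```python
import numpy as np

# The spectral theorem for a self-adjoint operator, finite-dimensional case:
# a Hermitian matrix A equals sum_k lambda_k P_k over orthogonal projections.
A = np.array([[2.0, 1.0], [1.0, 2.0]])    # a toy self-adjoint matrix
lams, V = np.linalg.eigh(A)               # real spectrum, orthonormal eigenbasis

# Reassemble A from its spectral decomposition, and apply a function to the
# spectrum (the "functional calculus"): here, a square root of A.
A_rebuilt = sum(l * np.outer(v, v) for l, v in zip(lams, V.T))
sqrt_A = sum(np.sqrt(l) * np.outer(v, v) for l, v in zip(lams, V.T))

assert np.allclose(A_rebuilt, A)
assert np.allclose(sqrt_A @ sqrt_A, A)
```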

This material can be found in many sources; I am working from Dales, Aiena, Eschmeier, Laursen and Willis’ Introduction to Banach Algebras, Operators, and Harmonic Analysis.

Below all vector spaces are over $\mathbb{C}$, all algebras are unital, and all algebra homomorphisms preserve units unless otherwise stated. In the context of Banach algebras, the last two assumptions are not standard, but in practice non-unital Banach algebras are studied by adjoining units first, so we do not lose much generality.

Hilbert spaces (and dagger categories)

Hilbert spaces are a particularly nice class of Banach spaces. They axiomatize ideas from Euclidean geometry such as orthogonality, projection, and the Pythagorean theorem, but the ideas apply to many infinite-dimensional spaces of functions of interest to various branches of mathematics. Hilbert spaces are also fundamental to quantum mechanics, as vectors in Hilbert spaces (up to phase) describe (pure) states of quantum systems.
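
The Euclidean ideas mentioned above can be sketched in a few lines of NumPy (the vectors are arbitrary toy choices): projecting a vector onto the span of a unit vector splits it into orthogonal pieces whose squared lengths add up, which is exactly the Pythagorean theorem.

```python
import numpy as np

# Orthogonal projection in a finite-dimensional Hilbert space: project x onto
# the span of a unit vector e and check the Pythagorean theorem.
x = np.array([3.0, 4.0, 12.0])
e = np.array([1.0, 0.0, 0.0])   # unit vector spanning a one-dim subspace

proj = np.dot(x, e) * e         # component of x along e
perp = x - proj                 # the orthogonal remainder

assert np.isclose(np.dot(proj, perp), 0.0)   # the pieces are orthogonal
assert np.isclose(np.dot(x, x), np.dot(proj, proj) + np.dot(perp, perp))
```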

Today we’ll develop and discuss some of the basic theory of Hilbert spaces. As with the theory of Banach spaces, there are (at least) two types of morphisms we might want to talk about (unitary operators and bounded operators), and we will discuss an elegant formalism that allows us to talk about both. Things written by John Baez will be cited excessively.

Poisson algebras and the classical limit

In the previous post we described the Heisenberg picture of quantum mechanics, which can be phrased quite generally as follows: given a noncommutative algebra $A$ (the algebra of observables of some quantum system) and a Hamiltonian $H \in A$, we obtain a derivation $[-, H]$, which is (up to some scalar multiple) the infinitesimal generator of time evolution. This is a natural and general way to start with an algebra and an energy function and get a notion of time evolution which automatically satisfies conservation of energy.

However, if $A$ is commutative, all commutators are trivial, and yet classical mechanics somehow takes a Hamiltonian $H \in A$ and produces a notion of time evolution. How does that work? It turns out that for algebras of observables $A$ of a classical system, we can think of $A$ as the classical limit $\hbar \to 0$ of a family $A_{\hbar}$ of noncommutative algebras. While $A$ is commutative, the noncommutativity of the family $A_{\hbar}$ equips $A$ with the extra structure of a Poisson bracket, and it is this Poisson bracket which allows us to describe time evolution.

Today we’ll describe one way to formalize the notion of taking the classical limit using the deformation theory of algebras. We’ll see how Poisson brackets pop out along the way, as well as the relevance of the lower Hochschild cohomology groups.
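
To see the Poisson bracket actually driving time evolution, here is a minimal numerical sketch for the unit harmonic oscillator (the Hamiltonian, step size, and integrator are illustrative choices, not from the post): with $H = (p^2 + q^2)/2$ and the canonical bracket, Hamilton's equations are $\dot q = \{q, H\} = p$ and $\dot p = \{p, H\} = -q$, and a symplectic integrator keeps the energy conserved up to integrator error.

```python
import numpy as np

# Classical time evolution from the Poisson bracket, for H = (p^2 + q^2)/2:
#   dq/dt = {q, H} = p,   dp/dt = {p, H} = -q.
# Integrate with symplectic Euler and watch H stay (nearly) conserved.
q, p, dt = 1.0, 0.0, 1e-3
H0 = 0.5 * (p**2 + q**2)
for _ in range(10_000):
    p -= q * dt     # dp = {p, H} dt
    q += p * dt     # dq = {q, H} dt  (using the updated p: symplectic Euler)
H1 = 0.5 * (p**2 + q**2)

assert abs(H1 - H0) < 1e-3   # energy conserved up to integrator error
```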

Update

I put up a post over at the StackOverflow blog describing a little of what I’ve been up to this summer.

Curiously enough, the Zipf distribution which shows up in that post is the same as the zeta distribution that shows up when trying to motivate the definition of the Riemann zeta function. I’m sure there is a conceptual explanation of this connection somewhere, probably coming from statistical mechanics, but I don’t know it. I suppose the approximate scale invariance of the zeta distribution is relevant to its appearance in many real-life statistics, as described in Terence Tao’s blog post on the subject.

The Heisenberg picture of quantum mechanics

In an earlier post we introduced the Schrödinger picture of quantum mechanics, which can be summarized as follows: the state of a quantum system is described by a unit vector $\psi$ in some Hilbert space $L^2(X)$ (up to multiplication by a constant), and time evolution is given by

$\displaystyle \psi \mapsto e^{ \frac{H}{i \hbar} t} \psi$

where $H$ is a self-adjoint operator on $L^2(X)$ called the Hamiltonian. Observables are given by other self-adjoint operators $F$, and at least in the case when $F$ has discrete spectrum, measurement can be described as follows: if $\psi_k$ is a unit eigenvector of $F$ with eigenvalue $F_k$, then $F$ takes the value $F_k$ upon measurement with probability $\left| \langle \psi, \psi_k \rangle \right|^2$; moreover, the state vector $\psi$ is projected onto $\psi_k$.
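
The measurement rule above is easy to verify numerically; here is a minimal sketch with a hypothetical $2 \times 2$ observable (not from the post): expanding the state in an eigenbasis of $F$, the numbers $\left| \langle \psi, \psi_k \rangle \right|^2$ are nonnegative, sum to $1$, and weight the eigenvalues to give the expectation $\langle \psi, F \psi \rangle$.

```python
import numpy as np

# Born rule for an observable with discrete spectrum: expand the state psi
# in an eigenbasis of F; |<psi, psi_k>|^2 is the probability of outcome F_k.
F = np.array([[0.0, 1.0], [1.0, 0.0]])    # a toy self-adjoint observable
psi = np.array([1.0, 0.0])                # unit state vector

F_k, eigvecs = np.linalg.eigh(F)          # eigenvalues and unit eigenvectors
probs = np.abs(eigvecs.conj().T @ psi)**2 # Born-rule probabilities

assert np.isclose(probs.sum(), 1.0)       # total probability is 1
# the weighted eigenvalues recover the expectation value <psi, F psi>
assert np.isclose(np.dot(probs, F_k), psi.conj() @ F @ psi)
```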

The Heisenberg picture is an alternate way of understanding time evolution which de-emphasizes the role of the state vector. Instead of transforming the state vector, we transform observables, and this point of view allows us to talk about time evolution (independent of measurement) without mentioning state vectors at all: we can work entirely with the algebra of bounded operators. This point of view is attractive because, among other things, once we isolate what properties we need this algebra to have we can abstract them to a more general setting such as that of von Neumann algebras.
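
The equivalence of the two pictures can be checked directly in a toy finite-dimensional model (the Hamiltonian, observable, and time are arbitrary illustrative choices, with $\hbar = 1$): evolving the state forward and evolving the observable backward give the same expectation values, since $\langle \psi(t), F \psi(t) \rangle = \langle \psi, F(t) \psi \rangle$ with $F(t) = U(t)^{\dagger} F U(t)$.

```python
import numpy as np

# Heisenberg vs. Schrodinger pictures agree on all expectation values.
H = np.array([[1.0, 0.5], [0.5, -1.0]])   # toy Hamiltonian (hbar = 1)
F = np.array([[0.0, 1.0], [1.0, 0.0]])    # toy observable
psi = np.array([1.0, 0.0], dtype=complex)
t = 0.7

# U(t) = exp(-iHt), built from the spectral decomposition of H.
lams, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * lams * t)) @ V.conj().T

schrodinger = (U @ psi).conj() @ F @ (U @ psi)        # evolve the state
heisenberg = psi.conj() @ (U.conj().T @ F @ U) @ psi  # evolve the observable

assert np.isclose(schrodinger, heisenberg)
```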

In order to get a feel for the kind of observables people actually care about, we won’t study a finite toy model in this post: instead we’ll work through some classical (!) one-dimensional examples.