
## Constructing Poisson algebras

(Commutative) Poisson algebras are clearly very interesting, so it would be nice to have ways of constructing examples. We know that $k[x, p]$ is a Poisson algebra with bracket uniquely defined by $\{ x, p \} = 1$; this describes a classical particle in one dimension, and is the classical limit of a quantum particle in one dimension (essentially the Weyl algebra).
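This bracket is easy to play with concretely. Here is a quick sanity check in sympy (a sketch; sympy is just one convenient way to realize the partial-derivative formula $\{ a, b \} = \frac{\partial a}{\partial x} \frac{\partial b}{\partial p} - \frac{\partial a}{\partial p} \frac{\partial b}{\partial x}$ determined by $\{ x, p \} = 1$):

```python
import sympy as sp

x, p = sp.symbols('x p')

def bracket(a, b):
    """The bracket on k[x, p] with {x, p} = 1, extended by the Leibniz rule:
    {a, b} = (da/dx)(db/dp) - (da/dp)(db/dx)."""
    return sp.expand(sp.diff(a, x) * sp.diff(b, p) - sp.diff(a, p) * sp.diff(b, x))

assert bracket(x, p) == 1                   # the defining relation
assert bracket(p, x) == -1                  # antisymmetry
assert bracket(x**2 * p, p) == 2 * x * p    # Leibniz: {x^2 p, p} = 2xp {x, p}
```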

More generally, if $A, B$ are Poisson algebras, then the tensor product $A \otimes_k B$ can be given a Poisson bracket given by extending

$\displaystyle \{ a_1 \otimes b_1, a_2 \otimes b_2 \} = \{ a_1, a_2 \} \otimes b_1 b_2 + a_1 a_2 \otimes \{ b_1, b_2 \}$

linearly. At least when $A, B$ are unital, this Poisson algebra is the universal Poisson algebra with Poisson maps from $A, B$ such that the images of elements of $A$ Poisson-commute with the images of elements of $B$. In particular, it follows that $k[x_1, p_1, ..., x_n, p_n]$ is a Poisson algebra with the bracket

$\displaystyle \{ x_i, x_j \} = \{ p_i, p_j \} = 0, \{ x_i, p_j \} = \delta_{ij}$.

This describes a classical particle in $n$ dimensions, or $n$ different classical particles in one dimension, and it is the classical limit of a quantum particle in $n$ dimensions, or $n$ different quantum particles in one dimension.
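The case $n = 2$ can be checked directly in sympy (a sketch; the implementation realizes the tensor-product bracket on $k[x_1, p_1] \otimes_k k[x_2, p_2] \cong k[x_1, p_1, x_2, p_2]$ as the sum of the two factor brackets, each treating the other factor's variables as constants):

```python
import sympy as sp

x1, p1, x2, p2 = sp.symbols('x1 p1 x2 p2')

def br(a, b, x, p):
    # canonical bracket in the variables (x, p); all other symbols are constants
    return sp.diff(a, x) * sp.diff(b, p) - sp.diff(a, p) * sp.diff(b, x)

def bracket(f, g):
    """Tensor-product bracket on k[x1, p1] (x) k[x2, p2]: the sum of the
    factor brackets, each holding the other factor's variables fixed."""
    return sp.expand(br(f, g, x1, p1) + br(f, g, x2, p2))

# The relations {x_i, p_j} = delta_ij, {x_i, x_j} = {p_i, p_j} = 0:
assert bracket(x1, p1) == 1 and bracket(x2, p2) == 1
assert bracket(x1, p2) == 0 and bracket(x1, x2) == 0 and bracket(p1, p2) == 0

# On pure tensors a1 (x) b1 = a1*b1 this recovers the displayed formula:
a1, b1, a2, b2 = x1**2, p2, p1, x2 * p2
lhs = bracket(a1 * b1, a2 * b2)
rhs = sp.expand(br(a1, a2, x1, p1) * b1 * b2 + a1 * a2 * br(b1, b2, x2, p2))
assert lhs == rhs
```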

Today we’ll discuss the question of how one might go about constructing Poisson brackets more generally.

### Alternating biderivations

Recall that an alternating biderivation $\{ -, - \}$ on an algebra $A$ is an alternating bilinear map which is a derivation in each variable. Furthermore, recall that if $D_1, D_2$ are derivations on $A$, then $\{ a, b \} = D_1(a) D_2(b) - D_1(b) D_2(a)$ is an alternating biderivation. Thus we get a natural alternating map $\text{Der}(A, A) \times \text{Der}(A, A) \to \text{ABDer}(A, A)$ which factors through the exterior square

$\displaystyle \Lambda^2(\text{Der}(A, A)) \to \text{ABDer}(A, A)$.

This map is injective since $D_1(a) D_2(b) - D_1(b) D_2(a) = 0$ for all $a, b \in A$ if and only if $D_1, D_2$ are scalar multiples of each other. A natural question is when this map is also surjective.

Proposition: The above map is surjective when $A = k[x_1, \ldots, x_n]$ is a polynomial algebra.

Proof. Any alternating biderivation $\{ -, - \}$ on $A$ is determined by its values $\{ x_i, x_j \} = c_{ij}$, $i < j$, on the generators. Furthermore, for each fixed pair $i < j$,

$\displaystyle \{ a, b \} = \frac{\partial a}{\partial x_i} \frac{\partial b}{\partial x_j} - \frac{\partial b}{\partial x_i} \frac{\partial a}{\partial x_j}$

is an alternating biderivation such that $\{ x_i, x_j \} = 1$ and such that $\{ x_k, - \} = 0$ for all $k \neq i, j$. It follows that for any choice of elements $c_{ij} \in A$, $i < j$, there exists a unique alternating biderivation satisfying $\{ x_i, x_j \} = c_{ij}$, given by

$\displaystyle \{ a, b \} = \sum_{i < j} c_{ij} \left( \frac{\partial a}{\partial x_i} \frac{\partial b}{\partial x_j} - \frac{\partial b}{\partial x_i} \frac{\partial a}{\partial x_j} \right)$.

This is a very special case of the Hochschild-Kostant-Rosenberg theorem, at least once you know that the Harrison cohomology $H^2_s(A, A)$ of a polynomial algebra vanishes. But this isn’t difficult to see: any commutative first-order deformation is necessarily itself a polynomial algebra because no relations can exist between any lifts of the generators.
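The closed formula in the proof invites experimentation. Here is a sympy sketch of the bracket on $A = k[x_1, x_2, x_3]$ for one arbitrary, purely illustrative choice of the $c_{ij}$ (note that nothing here guarantees the Jacobi identity; that comes later):

```python
import sympy as sp

x1, x2, x3 = xs = sp.symbols('x1 x2 x3')

# An arbitrary illustrative choice of c_ij in A = k[x1, x2, x3], i < j.
coeffs = {(0, 1): x3, (0, 2): sp.Integer(1), (1, 2): x1**2}

def bracket(a, b):
    """The alternating biderivation with {x_i, x_j} = c_ij from the proof."""
    return sp.expand(sum(cij * (sp.diff(a, xs[i]) * sp.diff(b, xs[j])
                                - sp.diff(b, xs[i]) * sp.diff(a, xs[j]))
                         for (i, j), cij in coeffs.items()))

assert bracket(x1, x2) == x3 and bracket(x2, x1) == -x3   # alternating
assert bracket(x1, x3) == 1 and bracket(x2, x3) == x1**2  # hits the c_ij
assert bracket(x1 * x2, x3) == x2 + x1**3                 # Leibniz rule
```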

The following alternate perspective on the above result may be enlightening. Note that any derivation $D : A \to A$ factors through the $A$-module generated by formal symbols $da$, $a \in A$, subject to the following relations:

1. $dc = 0$ whenever $c \in k$.
2. $d(a + b) = d(a) + d(b)$.
3. $d(ab) = a d(b) + d(a) b$.

This $A$-module is denoted $\Omega^1_{A/k}$ and known as the space of Kähler differentials. Since all of the above axioms are satisfied by the exterior derivative of a function on a smooth manifold, the intuition here is that $\Omega^1_{A/k}$ is analogous to the space of differential $1$-forms on $\text{Spec } A$. This fits in nicely with the intuition that derivations are analogous to vector fields on $\text{Spec } A$, since by the universal property a derivation $A \to A$ is the same thing as an $A$-module morphism $\Omega^1_{A/k} \to A$, so there is a natural pairing between the two.

But now it’s clear that an alternating biderivation on $A$ is nothing more than an $A$-module morphism

$\displaystyle \{ -, - \} : \Lambda^2(\Omega^1_{A/k}) \to A$.

Thus alternating biderivations are dual to $2$-forms. Intuitively, they are therefore bivector fields on $\text{Spec } A$. If $\Omega^1_{A/k}$, as an $A$-module, behaves sufficiently like a finite-dimensional vector space over a field, then the dual of its exterior square ought to be isomorphic to the exterior square of its dual, which is what we showed above when $A$ is a polynomial algebra. In that case $\Omega^1_{A/k}$ is in fact a free $A$-module on the generators $dx_1, \ldots, dx_n$, which is what makes the above argument work abstractly. More generally, I think the above argument goes through whenever $\Omega^1_{A/k}$ is finitely generated and projective.

(Above I am glossing over the distinction between the exterior square and the space of alternating $2$-tensors. I haven’t yet made up my mind about when this distinction is worth making.)

### The Jacobi identity

Now that we have a reasonably good grasp of alternating biderivations, what can we say about the ones that satisfy the Jacobi identity (and therefore are Poisson brackets)?

Proposition: An alternating biderivation on an algebra $A$ satisfies the Jacobi identity if and only if the Jacobi identity is satisfied on a set of generators of $A$.

Proof. It suffices to observe that the Jacobiator

$\displaystyle \{ a, \{ b, c \} \} + \{ b, \{ c, a \} \} + \{ c, \{ a, b \} \}$

is a triderivation: it is trilinear and satisfies the Leibniz rule in each of its components separately. (This is a straightforward computation using the Leibniz rule for $\{ -, - \}$.) Thus it is determined by its values on a set of generators of $A$.
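To illustrate, here is a sympy sketch using, as an example, the $\mathfrak{so}(3)$-type bracket $\{ x_1, x_2 \} = x_3$, $\{ x_2, x_3 \} = x_1$, $\{ x_3, x_1 \} = x_2$ on $k[x_1, x_2, x_3]$: the Jacobiator vanishes on the generators, hence, being a triderivation, on everything.

```python
import sympy as sp

x1, x2, x3 = xs = sp.symbols('x1 x2 x3')

# c_12 = x3, c_13 = -x2, c_23 = x1 gives {x1,x2}=x3, {x2,x3}=x1, {x3,x1}=x2.
coeffs = {(0, 1): x3, (0, 2): -x2, (1, 2): x1}

def bracket(a, b):
    return sp.expand(sum(cij * (sp.diff(a, xs[i]) * sp.diff(b, xs[j])
                                - sp.diff(b, xs[i]) * sp.diff(a, xs[j]))
                         for (i, j), cij in coeffs.items()))

def jacobiator(a, b, c):
    return sp.expand(bracket(a, bracket(b, c))
                     + bracket(b, bracket(c, a))
                     + bracket(c, bracket(a, b)))

# Vanishes on the generators ...
assert jacobiator(x1, x2, x3) == 0
# ... and hence on arbitrary elements, since the Jacobiator is a triderivation.
assert jacobiator(x1**2, x2 * x3, x1 + x2) == 0
```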

We record the following two immediate corollaries.

First, if $V$ is a vector space equipped with an alternating bilinear form $\omega : V \times V \to k$, then $\omega$ extends to a Poisson bracket on the symmetric algebra $S(V)$. (The Jacobi identity is clearly satisfied on generators: $\{ v, w \} \in k$ is a scalar for any $v, w \in V$, so every term of the Jacobiator vanishes.) These are precisely the polynomial Poisson algebras for which the Poisson bracket is graded with degree $-2$. In particular, if $V$ has a basis $x_1, \ldots, x_n, p_1, \ldots, p_n$ such that $\omega$ is given by

$\displaystyle \omega(x_i, x_j) = \omega(p_i, p_j) = 0, \omega(x_i, p_j) = \delta_{ij}$

then we get precisely the algebra of observables on $n$ classical particles as described earlier.

Second, if $\mathfrak{g}$ is a Lie algebra, then the Lie bracket $[-, -]$ extends to a Poisson bracket on the symmetric algebra $S(\mathfrak{g})$. These are precisely the polynomial Poisson algebras for which the Poisson bracket is graded with degree $-1$.
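For example, taking $\mathfrak{g} = \mathfrak{sl}_2$ with its standard basis $e, f, h$, the resulting bracket on $S(\mathfrak{sl}_2)$ can be sketched in sympy as follows (the Casimir computation at the end is a standard fact, included as a sanity check):

```python
import sympy as sp

e, f, h = gens = sp.symbols('e f h')

# Structure constants of sl2 in the standard basis:
# [e, f] = h, [h, e] = 2e, [h, f] = -2f.
table = {('e', 'f'): h, ('h', 'e'): 2 * e, ('h', 'f'): -2 * f}

def c(u, v):
    """Bracket of two basis elements, extended antisymmetrically."""
    if (str(u), str(v)) in table:
        return table[(str(u), str(v))]
    if (str(v), str(u)) in table:
        return -table[(str(v), str(u))]
    return sp.Integer(0)

def bracket(a, b):
    """The Lie-Poisson bracket on S(sl2): extend [-, -] by the Leibniz rule."""
    return sp.expand(sum(sp.diff(a, u) * sp.diff(b, v) * c(u, v)
                         for u in gens for v in gens))

assert bracket(e, f) == h and bracket(h, e) == 2 * e
# The Casimir element ef + fe + h^2/2 = 2ef + h^2/2 Poisson-commutes
# with the generators:
casimir = 2 * e * f + h**2 / 2
assert all(bracket(casimir, g) == 0 for g in gens)
```

Since $\{ \text{casimir}, - \}$ is a derivation vanishing on generators, the Casimir is in fact central in the whole Poisson algebra.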

In both of these cases we can explicitly find a deformation quantization: that is, we can identify a formal deformation from which we get the above Poisson algebras as classical limits. This will be expanded on in later posts.

### One Response

1. Great post. Can you recommend any references for this material?