
Note: as usual, I will be playing fast and loose with category theory in this post. Apologies to those who know better.

One way to define a subgroup $H$ of a group $G$ is as the image of a homomorphism into $G$. A representation of $G$ on $V$ is a homomorphism $G \to GL(V)$, and precomposing with the inclusion map $H \to G$ acts contravariantly to give a map $\text{Res}_H^G : \text{Rep}(G) \to \text{Rep}(H)$ called restriction. More concretely, the restricted representation $\rho|_H$ of a representation $\rho$ is defined simply by $\rho|_H(h) = \rho(h)$ for $h \in H$. Hence there is a functorial way to pass from a representation of a group $G$ to one of a subgroup $H$.

It is not obvious, however, whether there is a functorial way to pass from a representation of $H$ back to one of $G$. There is such a construction, which goes by the name of induction, and we will need it later. Today we’ll discuss the general category-theoretic context in which induction is understood: it is an example of an adjoint functor. For more about adjoints, see (in no particular order) posts at Concrete Nonsense, the Unapologetic Mathematician, and Topological Musings.

Adjoints are the next best thing to inverses

Are putting a ball into a bag and taking a ball out of a bag inverse operations? It might seem so at first glance. Consider, however, that starting with a bag with $n$ balls in it, there are $n$ ways to take a ball out and then to put another one in, whereas there are $n+1$ ways to put a ball in and then take a ball out. So from a combinatorial perspective, the two are not inverses. What they are, however, is in a certain precise sense adjoints (or transposes) of each other.

Here’s one way to make that statement precise. Suppose we are given a collection of bags such that there are $b_n$ bags with $n$ balls. Since bags are unordered, we should associate this collection with its exponential generating function $B(x) = \sum_{n \ge 0} b_n \frac{x^n}{n!}$. After taking a ball out of each bag, a bag that had $n$ balls in it will now have $n-1$ balls in it and be in one of $n$ states depending on which ball was taken out, so we pass to the derivative

$\displaystyle \frac{d}{dx} B(x) = \sum_{n \ge 0} b_{n+1} \frac{x^n}{n!}$.

When putting a ball back in, we simply add a ball to each bag, so we multiply by $x$ to get

$\displaystyle x B(x) = \sum_{n \ge 1} n b_{n-1} \frac{x^n}{n!}$.
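As a quick sanity check, we can verify with sympy that the two operations fail to commute and that their commutator acts as the identity; the coefficients $b_n$ below are hypothetical sample values:

```python
import sympy as sp

x = sp.symbols('x')
# a sample generating function with hypothetical coefficients b_0, b_1, b_2
B = 1 + 2*x + sp.Rational(5, 2)*x**2

take_out = lambda f: sp.diff(f, x)    # remove a ball: d/dx
put_in = lambda f: sp.expand(x * f)   # add a ball: multiply by x

# "put in then take out" minus "take out then put in" returns B itself,
# reflecting the operator relation [d/dx, x] = 1
diff = sp.expand(take_out(put_in(B)) - put_in(take_out(B)))
assert diff == sp.expand(B)
```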

These operations do not commute, so they cannot be inverses of one another. However, $\mathbb{Q}[x]$ has the following inner product: writing $D = \frac{d}{dx}$, the functionals $P \mapsto (D^n P)(0)$ form the dual basis to the basis $\{ \frac{x^n}{n!} \}$. Sending $x$ to $D$ and extending by multiplicativity, so that $\left< P, Q \right> = (P(D) Q)(0)$, gives the inner product

$\left< x^i, x^j \right> = \begin{cases} 0 \text{ if } i \neq j \\ i! \text{ if } i = j \end{cases}$

and in this inner product we have the adjoint relation

$\left< x P(x), Q(x) \right> = \left< P(x), \frac{d}{dx} Q(x) \right>$.
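Both the inner product and the adjoint relation can be checked mechanically with sympy; the polynomials $P, Q$ in the last check are arbitrary hypothetical choices:

```python
import sympy as sp

x = sp.symbols('x')

def inner(P, Q):
    # <P, Q> = (P(D) Q)(0): replace each x^n in P by d^n/dx^n, apply to Q, evaluate at 0
    coeffs = sp.Poly(P, x).all_coeffs()[::-1]  # [c_0, c_1, ...]
    return sum(c * sp.diff(Q, x, n) for n, c in enumerate(coeffs)).subs(x, 0)

# <x^i, x^j> = 0 for i != j and i! for i = j
for i in range(4):
    for j in range(4):
        assert inner(x**i, x**j) == (sp.factorial(i) if i == j else 0)

# the adjoint relation <x P, Q> = <P, dQ/dx> on hypothetical sample polynomials
P, Q = 1 + 3*x**2, x + x**3
assert inner(sp.expand(x*P), Q) == inner(P, sp.diff(Q, x))  # both sides equal 19
```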

Note that we are regarding $x$ simultaneously as an element of $\mathbb{Q}[x]$ and as a multiplication operator. The algebra generated by $x$ and $\frac{d}{dx}$, which is a subalgebra of the full endomorphism algebra of $\mathbb{Q}[x]$, is the Weyl algebra in one variable. This will become relevant later.

If you don’t like that example, then let $A$ be the adjacency matrix of a graph. Then $A^T$ is the adjacency matrix of the opposite graph, where every edge is reversed. ($A^{-1}$, on the other hand, almost never has positive integer coefficients.)
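In matrix form, for a hypothetical three-vertex directed graph:

```python
import numpy as np

# adjacency matrix of a hypothetical directed 3-cycle: edges 0->1, 1->2, 2->0
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])

# A.T is the adjacency matrix of the opposite graph: every edge is reversed
assert A.T[1, 0] == 1 and A.T[2, 1] == 1 and A.T[0, 2] == 1

# and transposition is literally adjointness: <Ax, y> = <x, A^T y>
x, y = np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0])
assert np.isclose((A @ x) @ y, x @ (A.T @ y))
```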

There are many common examples of pairs of constructions that take algebraic objects to other algebraic objects which are not inverses, but which feel inverse-like. For example:

1. Let $K$ be a perfect field and let $L/K$ be a Galois extension with Galois group $G = \text{Gal}(L/K)$. There is a map from intermediate fields $L \ge F \ge K$ to the subgroup $\text{Gal}(L/F)$ of $G$ and another map from subgroups $H \le G$ to their fixed fields $L^H$. The fundamental theorem of Galois theory asserts that these maps are mutually inverse when $L/K$ is finite; for infinite extensions they are not inverses unless $H$ is closed in the Krull topology.
2. Let $k$ be an algebraically closed field. There is a map from subsets $J$ of $k[x_1, ... x_n]$ to the set of points $Z(J)$ in $k^n$ at which every element of the subset vanishes and another map from subsets $V$ of $k^n$ to the set of polynomials $I(V)$ that vanish on that subset. The strong Nullstellensatz asserts that these maps are not inverses unless $J$ is a radical ideal and $V$ is an affine variety, respectively.
3. Let $G$ be a group. There is a map sending $G$ to its abelianization $G/[G, G]$. We should interpret this map as a functor $\text{Grp} \to \text{Ab}$ from groups to abelian groups to figure out what to pair it with: the inclusion functor $\text{Ab} \to \text{Grp}$. These maps are not inverses unless $G$ is abelian.
4. Let $G$ be a group. There is a map $\text{Grp} \to \text{Set}$ sending a group to its underlying set and another map $\text{Set} \to \text{Grp}$ sending a set to the free group on it. These maps are never (naturally) inverses.

The first two examples are two of my favorites, so we’ll talk about them first. In the Galois case, the set of intermediate fields $L \ge F \ge K$ has a natural poset structure by inclusion, and so does the set of subgroups of $G$. These poset structures, however, are “opposite” when related by the functions defined above – if the subgroup is large, then its fixed field is small, and if $F$ is small, then $\text{Gal}(L/F)$ is large. How can we make this precise?

Well, we want to examine to what extent these functions are inverses of each other, so let’s compose them. It’s not hard to see that $\text{Gal}(L/L^H)$ is at least as large as $H$, and it’s also not hard to see that $L^{\text{Gal}(L/F)}$ is at least as large as $F$. In other words, $H \subseteq \text{Gal}(L/L^H)$ and $F \subseteq L^{\text{Gal}(L/F)}$. Let’s define the following poset structures on the subgroups of $G$ and the intermediate fields $L \ge F \ge K$: we’ll say that $H_1 \ge H_2$ if $H_2$ is a subgroup of $H_1$, and we’ll say that $F_1 \ge F_2$ if $F_2$ is a subfield of $F_1$. Call these posets $\text{Sub}(G)$ and $\text{Int}(L/K)$ respectively. Now, we know that these poset structures can be thought of as categories where $\text{Hom}(a, b)$ is a single arrow if $a \ge b$ and no arrow otherwise, and recall that for a category $C$ the category $C^{op}$ is obtained by reversing all arrows. Now I claim that both of the above observations are subsumed in the single observation that

$\displaystyle \text{Hom}_{\text{Sub}(G)}(\text{Gal}(L/F), H) = \text{Hom}_{\text{Int}(L/K)^{op}}(F, L^H)$.

When $F = L^H$ we recover the first observation and when $H = \text{Gal}(L/F)$ we recover the second. The above may look intimidating, but when unpacked it says something very trivial: $H$ fixes every element of $F$ if and only if every element of $F$ is fixed by $H$.

Now I claim that the functions $F \mapsto \text{Gal}(L/F)$ and $H \mapsto L^H$ are order-preserving functions between $\text{Int}(L/K)^{op}$ and $\text{Sub}(G)$, and order-preserving functions between posets are precisely the functors between the corresponding categories. Category theorists say that these two maps form an adjoint pair (with the equality replaced by a natural isomorphism) relating $\text{Sub}(G)$ and $\text{Int}(L/K)^{op}$: $F \mapsto \text{Gal}(L/F)$ is the left adjoint and $H \mapsto L^H$ is the right adjoint. A pair of adjoint functors between posets is called a monotone Galois connection, and this example is the motivation for that terminology.

But we didn’t relate our posets by an adjunction – we related one poset to the opposite of the other. This relationship is called an antitone Galois connection because the maps involved are order-reversing (read: contravariant). And you can check that for ideals and varieties the exact same relationship holds.
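Both the Galois example and the ideals-and-varieties example arise from the same pattern: any relation between two sets induces an antitone Galois connection between their power sets (in the Galois case the relation is "$\sigma$ fixes $\alpha$"; in the Nullstellensatz case it is "$p$ vanishes at $v$"). Here is a minimal sketch using a hypothetical divisibility relation between two small sets of integers:

```python
from itertools import chain, combinations

def polar(A, Y, related):
    """Everything in Y related to every element of A."""
    return frozenset(y for y in Y if all(related(a, y) for a in A))

def subsets(S):
    return (frozenset(c) for c in
            chain.from_iterable(combinations(S, r) for r in range(len(S) + 1)))

# hypothetical relation: x is related to y when x divides y
X, Y = {2, 3, 4, 6}, {6, 8, 12}
R = lambda x, y: y % x == 0

F = lambda A: polar(A, Y, R)                     # subsets of X -> subsets of Y
G = lambda B: polar(B, X, lambda b, x: R(x, b))  # subsets of Y -> subsets of X

for A in subsets(X):
    assert A <= G(F(A))                 # the composite only grows A
    for B in subsets(Y):
        # the antitone adjunction: B <= F(A) if and only if A <= G(B)
        assert (B <= F(A)) == (A <= G(B))
```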

The motivation for the term “adjoint” is that the relationship $\text{Hom}(FX, Y) \simeq \text{Hom}(X, GY)$ looks an awful lot like the relationship between a pair of adjoint linear transformations. In fact, as we’ve seen, in at least one situation it makes sense to think of $\text{Hom}$ as a categorification of the inner product. But there’s much deeper stuff going on here: Todd Trimble’s post on adjoints discusses the relationship between adjoints and the Yoneda lemma, but I haven’t really digested this point yet.

Free constructions

A standard set of examples of adjoint functors are free and forgetful functors. Roughly speaking, if a category $D$ consists of objects in a category $C$ “with extra structure,” then there is a functor $D \to C$ which “forgets” this structure. In nice cases this functor has a left adjoint which is the “free $D$-object” associated to a $C$-object. In typical examples we’ll take $C = \text{Set}$ and $D$ to be an “algebra” (in the sense of universal algebra), i.e. a set equipped with operations satisfying some axioms.

As mentioned above, the standard example here is when $D = \text{Grp}$. The free group on a set $S$ is typically defined as the group generated by the elements of $S$ with no relations besides those imposed by the group axioms. More precisely, if $S = \{ s_1, ... s_n \}$ then $F(S)$ is the set of reduced words on the alphabet $s_1, ... s_n, s_1^{-1}, ... s_n^{-1}$, with group operation concatenation and with the relations $s_i s_i^{-1} = s_i^{-1} s_i = e$ imposed by the group axioms. Let $U(G)$ denote the underlying set of the group $G$, and let’s verify the adjoint relation

$\displaystyle \text{Hom}_{\text{Grp}}(F(S), G) \simeq \text{Hom}_{\text{Set}}(S, U(G))$.

But this is straightforward. A function $S \to U(G)$ is just an assignment of a group element $g_i \in G$ to every element $s_i \in S$. Since $F(S)$ is generated by the elements $s_i$, sending each $s_i$ to $g_i$ (and hence each $s_i^{-1}$ to $g_i^{-1}$) extends uniquely to a group homomorphism $F(S) \to G$, and moreover every group homomorphism $F(S) \to G$ arises in this way. This is what it really means to be a free group: homomorphisms out of a free object should be as unconstrained as possible, which is essentially the universal property of the free group construction.
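The bijection can be sketched in code: represent an element of $F(S)$ as a word of (generator, exponent) pairs, and extend a set-map $S \to U(G)$ to a homomorphism. Here $G$ is taken to be $(\mathbb{Z}, +)$, a hypothetical choice for illustration:

```python
# A word in the free group on S = {'a', 'b'}: a list of (generator, ±1) pairs.
def extend(f):
    """Extend a set-map f : S -> Z to a group homomorphism F(S) -> (Z, +)."""
    def hom(word):
        total = 0
        for s, sign in word:
            total += sign * f(s)  # s^{-1} must go to the inverse of f(s)
        return total
    return hom

f = {'a': 3, 'b': 5}.get  # an arbitrary function S -> U(G)
hom = extend(f)
# the word a b a^{-1} b maps to 3 + 5 - 3 + 5 = 10
assert hom([('a', 1), ('b', 1), ('a', -1), ('b', 1)]) == 10
# the empty word (the identity of F(S)) maps to the identity of G
assert hom([]) == 0
```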

Another nice example is when $C = \text{Vect}, D = \text{Alg}$ (where we fix the base field $K$). One thinks of an algebra as a vector space $V$ equipped with a bilinear multiplication $V \times V \to V$ satisfying associativity. The forgetful functor $U : \text{Alg} \to \text{Vect}$ just ignores this map, so what does its adjoint look like? Well, if we want to imitate the above construction, then what we should do is to define formal multiplication on $V$ subject to no constraint other than associativity. This will be easiest to do given a basis $v_1, ... v_n$. First we need to introduce an identity $1$. To define a product of two vectors we’ll introduce formal symbols $v_i \otimes v_j, 1 \le i, j \le n$ and require bilinearity. This defines the tensor product $V \otimes V$, which is universal for bilinear maps $V \times V \to W$ into any vector space $W$. But since we also want to define products of three or more vectors we need to introduce more formal symbols $v_i \otimes v_j \otimes v_k$, which is done by considering the higher tensor powers $V^{\otimes k} \simeq V \otimes V^{\otimes (k-1)}$. The tensor product is not, strictly speaking, associative, but there is a natural associativity isomorphism $(v_i \otimes v_j) \otimes v_k \to v_i \otimes (v_j \otimes v_k)$. Since there are no relations imposed between these symbols, the free algebra $F(V)$ is the direct sum of all of these tensor powers, i.e.

$\displaystyle F(V) \simeq \bigoplus_{k=0}^{\infty} V^{\otimes k}$

where $V^{\otimes 0} = K$. This defines the tensor algebra on $V$, and it has the following universal property: any linear map $V \to U(A)$ where $A$ is an algebra extends to a unique algebra homomorphism $F(V) \to A$ in the obvious way. In other words, we have the adjoint relation

$\text{Hom}_{\text{Alg}}(F(V), A) \simeq \text{Hom}_{\text{Vect}}(V, U(A))$.
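Concretely, a basis of $F(V)$ is given by words in the basis vectors of $V$, with multiplication by concatenation; here is a small enumeration for a hypothetical two-dimensional $V$:

```python
from itertools import product

basis = ['v1', 'v2']  # a hypothetical basis of a 2-dimensional V

# basis words of V^{⊗k}: dim V^{⊗k} = 2^k, so the graded dimensions up to
# degree 3 are 1, 2, 4, 8 (degree 0 is the single empty word, spanning K)
dims = [len(list(product(basis, repeat=k))) for k in range(4)]
assert dims == [1, 2, 4, 8]

# multiplication in the tensor algebra is concatenation of basis words
w1, w2 = ('v1', 'v2'), ('v2',)
assert w1 + w2 == ('v1', 'v2', 'v2')
```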

In the next post we’ll apply these ideas to constructing the induced representation.

### Responses

1. It’s a little confusing (to me) what is meant by “Sending x to D and extending by multiplicativity…”

From context, I infer that you’re fixing the basis above and taking either the first or second component to its associated dual, then using evaluation to give an inner product. But I’m not at all clear on how the preceding sentence describes that… what is D?

• D means $\frac{d}{dx}$, and I mean that the inner product is defined by sending $x^n$ to $\frac{d^n}{dx^n}$.

2. on November 18, 2009 at 2:44 am, noncommutativealgebraicgeometry:

The adjoint pair of the tensor algebra and forgetful functors is very useful for proving representability of the (noncommutative) Grassmannian as a presheaf $Gr(M, V)$, where $M$ is an object of finite type, i.e. a finitely generated projective module.