Charles Siegel over at Rigorous Trivialities suggested a NaNoWriMo for math bloggers: instead of writing a 50,000-word novel, just write a blog post every day. I have to admit I rather like the idea, so we’ll see if I can keep it up.

Continuing the previous post, what we want to do now is to think of restriction as a forgetful functor, since restricting a representation from $G$ to a subgroup $H$ just corresponds to forgetting some of the data that defines it. Its left adjoint, if it exists, should be a construction of the “free $G$-representation” associated to an $H$-representation. Given a representation $\rho : H \to \text{GL}(V)$ we therefore want to find a representation $\text{Ind}_H^G \rho : G \to \text{GL}(\text{Ind}_H^G V)$ with the following universal property: any $H$-intertwining operator $V \to W$ for a $G$-representation $\rho' : G \to \text{GL}(W)$ naturally determines a unique $G$-intertwining operator $\text{Ind}_H^G V \to W$. In other words, we want to construct a functor $\text{Ind}_H^G$ such that

$\displaystyle \text{Hom}_G(\text{Ind}_H^G V, W) \simeq \text{Hom}_H(V, \text{Res}_H^G W).$

**Cosets**

To gain some insight into what’s going on, consider the special case that $\rho$ is the trivial representation of $H$ on $V = \mathbb{C}$, hence $\rho(h) = 1$ for all $h \in H$. Given another $G$-representation $\rho' : G \to \text{GL}(W)$, an $H$-intertwining operator $\eta : \mathbb{C} \to W$ must satisfy $\eta(v) = \rho'(h) \eta(v)$ for all $h \in H$, hence either $\eta$ is identically zero or the image of $\eta$ consists of $H$-fixed vectors; loosely, $\rho'$ is “constant on $H$” as far as $\eta$ can see. What representation is universal for intertwining operators out of the trivial representation of $H$?

The answer is immediate: it’s the coset representation! The formal way to “pretend that elements of $H$ are the identity” is to consider two group elements $g_1, g_2$ to be equivalent if $g_1 = g_2 h$ for some $h \in H$, and the equivalence classes of this relation are precisely the cosets $gH$, on which $G$ acts by left multiplication. Extend this action to a representation on the free vector space $\mathbb{C}[G/H]$ in the obvious way. I now claim that this representation has the desired universal property, as follows:
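To make the coset representation concrete, here is a small Python sketch (my own illustration, not from the post): it builds the cosets of the order-two subgroup $H = \{e, (0\,1)\}$ inside $S_3$ and checks that left multiplication on cosets really gives a homomorphism into permutation matrices. The helper names (`compose`, `coset_index`, `rho`) are ad hoc.

```python
from itertools import permutations

# Elements of S_3 as tuples: p maps i -> p[i].
def compose(p, q):           # (p ∘ q)(i) = p(q(i))
    return tuple(p[q[i]] for i in range(3))

G = list(permutations(range(3)))
e = (0, 1, 2)
H = [e, (1, 0, 2)]           # the subgroup generated by the transposition (0 1)

# Left cosets gH, each stored as a frozenset so equal cosets coincide.
cosets = sorted({frozenset(compose(g, h) for h in H) for g in G},
                key=lambda c: sorted(c))

def coset_index(g):
    return next(i for i, c in enumerate(cosets) if g in c)

# Permutation matrix of g acting on cosets by left multiplication.
def rho(g):
    n = len(cosets)
    M = [[0] * n for _ in range(n)]
    for i, c in enumerate(cosets):
        rep = next(iter(c))  # any representative of the coset works
        M[coset_index(compose(g, rep))][i] = 1
    return M

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# rho is a homomorphism: rho(g1 g2) = rho(g1) rho(g2) for all pairs.
assert all(rho(compose(g1, g2)) == matmul(rho(g1), rho(g2))
           for g1 in G for g2 in G)
# Every h in H fixes the identity coset eH, as expected.
assert all(rho(h)[coset_index(e)][coset_index(e)] == 1 for h in H)
print("coset representation of S_3 on S_3/H verified; dimension", len(cosets))
```

The dimension is the index $[G:H] = 3$, and the identity coset spans an $H$-fixed line, matching the discussion above.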

If $W$ has no nonzero $H$-fixed vectors, the only $H$-intertwining operator $\mathbb{C} \to W$ is identically zero. Similarly, the only $G$-intertwining operator $\mathbb{C}[G/H] \to W$ is identically zero, since the image of the coset $eH$ must itself be an $H$-fixed vector. If $w \in W$ is an $H$-fixed vector, then an $H$-intertwining operator $\eta : \mathbb{C} \to W$ is determined by the image $w = \eta(1)$, and any such choice of $w$ uniquely determines a $G$-intertwining operator $\mathbb{C}[G/H] \to W$ by sending a coset $gH$ to $\rho'(g) w$ and extending by linearity (this is well-defined precisely because $w$ is $H$-fixed), and every intertwining operator arises in this way.

This coset construction generalizes as follows, following the Wikipedia article: given a representation $\rho : H \to \text{GL}(V)$, define

$\displaystyle \text{Ind}_H^G V = \bigoplus_{i=1}^{n} g_i V$

where $G = \coprod_{i=1}^{n} g_i H$ is a decomposition of $G$ into left cosets of $H$ and $g_i V$ is just a copy of $V$. The representation $\text{Ind}_H^G \rho$ is defined as follows: for $g \in G$ we have $g g_i = g_j h$ for a unique $j$ and a unique $h \in H$, and $g$ sends a vector $g_i v \in g_i V$ to $g_j (\rho(h) v) \in g_j V$. This action is extended by linearity. Again, we verify the universal property, as follows:
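Here is a hypothetical Python sketch of this construction for $G = S_3$, $H = A_3$, and the one-dimensional representation of $H$ sending the 3-cycle to a primitive cube root of unity $\omega$; the matrices it produces form the familiar two-dimensional irreducible representation of $S_3$. The names `reps`, `ind`, and `rho_H` are my own.

```python
import cmath
from itertools import permutations

def compose(p, q):                     # (p ∘ q)(i) = p(q(i)) on tuples
    return tuple(p[q[i]] for i in range(3))

def inverse(p):
    q = [0] * 3
    for i, v in enumerate(p):
        q[v] = i
    return tuple(q)

G = list(permutations(range(3)))
e = (0, 1, 2)
c3 = (1, 2, 0)                         # the 3-cycle (0 1 2)
H = [e, c3, compose(c3, c3)]           # H = A_3, cyclic of order 3

w = cmath.exp(2j * cmath.pi / 3)
rho_H = {e: 1, c3: w, compose(c3, c3): w * w}   # 1-dim rep of H

reps = [e, (1, 0, 2)]                  # coset representatives g_1, g_2

# Induced representation: g sends g_i v to g_j (rho(h) v) where g g_i = g_j h.
def ind(g):
    n = len(reps)
    M = [[0] * n for _ in range(n)]
    for i, gi in enumerate(reps):
        ggi = compose(g, gi)
        for j, gj in enumerate(reps):  # find the unique j with g g_i in g_j H
            h = compose(inverse(gj), ggi)
            if h in H:
                M[j][i] = rho_H[h]
                break
    return M

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def close(A, B):
    return all(abs(A[i][j] - B[i][j]) < 1e-9
               for i in range(len(A)) for j in range(len(A)))

# The block construction really is a homomorphism of G.
assert all(close(ind(compose(g1, g2)), matmul(ind(g1), ind(g2)))
           for g1 in G for g2 in G)
print("induced representation is a homomorphism; dimension", len(reps))
```

Note that the homomorphism check works out exactly as in the verification below: $\rho'(g)\rho'(g_i) = \rho'(g_j h) = \rho'(g_j)\rho'(h)$ block by block.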

Any $H$-intertwining operator $\eta : V \to W$ satisfies $\eta(\rho(h) v) = \rho'(h) \eta(v)$ for all $h \in H$. It defines an operator $\tilde{\eta} : \text{Ind}_H^G V \to W$ by sending $g_i v$ to $\rho'(g_i) \eta(v)$, again extended by linearity, and we claim that this operator is $G$-intertwining. This is equivalent to the claim that, if $g g_i = g_j h$ as above, then $\tilde{\eta}(g \cdot g_i v) = \rho'(g) \tilde{\eta}(g_i v)$, or

$\displaystyle \rho'(g_j) \eta(\rho(h) v) = \rho'(g) \rho'(g_i) \eta(v) = \rho'(g_j) \rho'(h) \eta(v),$

and this property is satisfied if and only if $\eta(\rho(h) v) = \rho'(h) \eta(v)$, which is precisely the $H$-intertwining condition. As before, any $G$-intertwining operator $\text{Ind}_H^G V \to W$ arises in this way, although verifying this is somewhat messy.

**The module perspective**

The generalization of the above argument in module theory gives what is called the **extension of scalars** functor. Given a ring $S$ and a subring $R$, any left $S$-module naturally acquires the structure of a left $R$-module by restriction of scalars, which is the generalization of the restriction functor above if we set $S = \mathbb{C}[G], R = \mathbb{C}[H]$. Restriction defines a functor $S\text{-Mod} \to R\text{-Mod}$, and as above restriction has a left adjoint called extension. This adjunction follows from a more general adjunction, which is as follows:

Suppose $M$ is a left $R$-module and $N$ is a left $S$-module, and consider the set of all maps $f : S \times M \to N$ which are $S$-linear in the first variable and $R$-linear in the second variable (so in particular $f(sr, m) = f(s, rm)$). For fixed $m$, the map $s \mapsto f(s, m)$ is $S$-linear, hence determined by $f(1, m)$, and the assignment $m \mapsto f(1, m)$ is itself $R$-linear, so $f$ defines an $R$-linear map $M \to N$. Moreover, all such maps arise this way. On the other hand, any such map must also be $R$-bilinear, hence it factors through the tensor product $S \otimes_R M$. This tensor product comes equipped with a natural action of $S$ on the left factor turning it into a left $S$-module, and then the extra condition that $f$ be $S$-linear in the first variable is precisely the condition that the induced map $S \otimes_R M \to N$ be $S$-linear. Again, all such maps arise this way. It follows that we have a natural equivalence

$\displaystyle \text{Hom}_S(S \otimes_R M, N) \simeq \text{Hom}_R(M, N).$

In other words, for any subring $R \subseteq S$ the functor $S \otimes_R -$ (a free construction) is left adjoint to the functor $\text{Res} : S\text{-Mod} \to R\text{-Mod}$ (a forgetful functor).

Here’s why this is relevant: let $S = \mathbb{C}[G], R = \mathbb{C}[H], M = V$. Our goal is to multiply elements of $\mathbb{C}[G]$ by elements of $V$ in a universal fashion (this defines an action of $G$ on the result), hence to find a universal object for functions $\mathbb{C}[G] \times V \to W$ which are $\mathbb{C}[G]$-linear in the first variable and $\mathbb{C}[H]$-linear in the second; this is precisely the setup above. Since $\mathbb{C}[G]$-modules are precisely the representations of $G$, and likewise for $H$, we are then led to the following conclusion.

**Proposition:** $\text{Ind}_H^G V \simeq \mathbb{C}[G] \otimes_{\mathbb{C}[H]} V$.

A little thought will reveal that this is essentially the same construction as above.
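One way to see this concretely (a numerical check of the proposition, not an argument from the post) is to compute $\mathbb{C}[G] \otimes_{\mathbb{C}[H]} V$ as the quotient of the free vector space on symbols $e_g \otimes v$ by the balancing relations $e_{gh} \otimes v = \rho(h)\, e_g \otimes v$, and verify that its dimension comes out to $[G:H] \dim V$. The sketch below does this for $G = S_3$, $H = A_3$, and a nontrivial character of $H$, using a homemade Gaussian-elimination rank; all names are my own.

```python
import cmath
from itertools import permutations

def compose(p, q):
    return tuple(p[q[i]] for i in range(3))

G = list(permutations(range(3)))
e = (0, 1, 2)
c3 = (1, 2, 0)
H = [e, c3, compose(c3, c3)]                 # H = A_3
w = cmath.exp(2j * cmath.pi / 3)
rho = {e: 1, c3: w, compose(c3, c3): w * w}  # 1-dim rep of H

idx = {g: i for i, g in enumerate(G)}

# Balancing relations e_{gh} - rho(h) e_g spanning the subspace we quotient by.
rows = []
for g in G:
    for h in H:
        row = [0j] * len(G)
        row[idx[compose(g, h)]] += 1
        row[idx[g]] -= rho[h]
        rows.append(row)

def rank(rows, tol=1e-9):
    rows = [r[:] for r in rows]
    r = 0
    for col in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if abs(rows[i][col]) > tol),
                   None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and abs(rows[i][col]) > tol:
                f = rows[i][col] / rows[r][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

dim = len(G) - rank(rows)
print("dim C[G] tensor_{C[H]} V =", dim)     # expect [G:H] * dim V = 2
```

The relations collapse each of the two cosets of $H$ to a single line, leaving dimension $2 = [G:H] \cdot \dim V$, the same dimension as the induced representation built from coset representatives above.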

**Covering**

A combinatorial way to think about restriction and induction is as follows: given a subgroup $H$ of a group $G$, construct a category whose objects are the irreducible representations of either $H$ or $G$, where there are $n$ arrows from an irreducible representation $V$ of $H$ to an irreducible representation $W$ of $G$ if $\dim \text{Hom}_H(V, \text{Res}_H^G W) = n$. What Frobenius reciprocity tells us is that these arrows can be interpreted in two dual ways: either as the number of times $W$ appears in the induced representation $\text{Ind}_H^G V$, or as the number of times $V$ appears in the restricted representation $\text{Res}_H^G W$. This turns the category into a generalization of a graded poset with two ranks, where two objects may be related in more than one way.
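Assuming the standard induced-character formula and the character tables of $S_2$ and $S_3$ (neither of which is derived in this post), the two interpretations of the arrows can be checked numerically: both inner products give the same multiplicities, and the resulting arrows form the bottom two ranks of Young's lattice.

```python
from itertools import permutations
from fractions import Fraction

def compose(p, q):
    return tuple(p[q[i]] for i in range(3))

def inverse(p):
    q = [0] * 3
    for i, v in enumerate(p):
        q[v] = i
    return tuple(q)

G = list(permutations(range(3)))
H = [(0, 1, 2), (1, 0, 2)]               # S_2 embedded in S_3 as {e, (0 1)}

def fixed(p):                            # fixed-point count classifies S_3 classes
    return sum(p[i] == i for i in range(3))

chi_G = {                                # character table of S_3
    "triv": {3: 1, 1: 1, 0: 1},
    "sign": {3: 1, 1: -1, 0: 1},
    "std":  {3: 2, 1: 0, 0: -1},
}
chi_H = {                                # character table of S_2
    "triv": {(0, 1, 2): 1, (1, 0, 2): 1},
    "sign": {(0, 1, 2): 1, (1, 0, 2): -1},
}

def ind_char(chi, g):                    # standard induced-character formula
    vals = [chi[compose(inverse(x), compose(g, x))]
            for x in G if compose(inverse(x), compose(g, x)) in H]
    return Fraction(sum(vals), len(H))

mult = {}
for a, chiV in chi_H.items():
    for b, chiW in chi_G.items():
        lhs = Fraction(sum(ind_char(chiV, g) * chiW[fixed(g)] for g in G),
                       len(G))           # <Ind chi_V, chi_W>_G
        rhs = Fraction(sum(chiV[h] * chiW[fixed(h)] for h in H),
                       len(H))           # <chi_V, Res chi_W>_H
        assert lhs == rhs                # Frobenius reciprocity, numerically
        mult[(a, b)] = lhs

print({k: int(v) for k, v in mult.items()})
```

The nonzero multiplicities say that the trivial representation of $S_2$ sits under the trivial and standard representations of $S_3$, while the sign representation sits under the sign and standard representations, each with multiplicity one.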

This construction generalizes to any chain of groups. Next time we’ll see what this means for the symmetric groups. (There’s one more thing I need the induced representation for, too.)
