
## The adjoint functor theorem for posets

Recently in Measure Theory we needed the following lemma.

Lemma: Let $g : \mathbb{R} \to \mathbb{R}$ be non-constant, right-continuous and non-decreasing, and let $I = (g(-\infty), g(\infty))$. Define $f : I \to \mathbb{R}$ by $f(x) = \inf \{ y \in \mathbb{R} : x \le g(y) \}$. Then $f$ is left-continuous and non-decreasing. Moreover, for $x \in I$ and $y \in \mathbb{R}$,

$f(x) \le y \Leftrightarrow x \le g(y)$.

If you’re categorically minded, this last condition should remind you of the definition of a pair of adjoint functors. In fact it is possible to interpret the above lemma this way; it is a special case of the adjoint functor theorem for posets. Today I’d like to briefly explain this. (And who said category theory isn’t useful in analysis?)
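For a concrete sanity check on the adjunction condition, take $g = \lfloor \cdot \rfloor$, which is right-continuous and non-decreasing; the lemma's $f$ is then the ceiling function, and we recover the familiar floor–ceiling Galois connection. A minimal numerical sketch (the choice of $g$ here is my own illustrative example, not from the lemma):

```python
import math
from itertools import product

# Take g = floor, which is right-continuous and non-decreasing.
# The lemma's f(x) = inf{y : x <= floor(y)} is then the ceiling function.
g = math.floor
f = math.ceil

# Check the adjunction f(x) <= y  <=>  x <= g(y) on a grid of half-integers.
xs = [k / 2 for k in range(-10, 11)]
ys = [k / 2 for k in range(-10, 11)]
assert all((f(x) <= y) == (x <= g(y)) for x, y in product(xs, ys))
```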

The usual caveats regarding someone who’s never formally studied category theory talking about it apply. I welcome any corrections.

## Mathematics in real life II

Another small example I noticed a while ago and forgot to write up.

Prime numbers, as one of the most fundamental concepts in mathematics, have a way of turning up in unexpected places. For example, the life cycles of some cicadas are either $13$ or $17$ years. It’s thought that this is a response to predation by predators with shorter life cycles; if your life cycle is prime, a predator with any shorter life cycle can’t reliably predate upon you.

A month or so ago I noticed a similar effect happening in the card game BS. In BS, some number of players (usually about four) are dealt the same number of cards from a standard deck without jokers. Beginning with one fixed card, such as the two of clubs, players take turns placing some number of cards face-down in the center. The catch is that the players must claim that they are placing down some number of a specific card; Player 1 must claim that they are placing down twos, Player 2 must claim that they are placing down threes, and so forth until we get to kings and start over. Any time cards are played, another player can accuse the current player of lying. If the accusation is right, the lying player must pick up the pile in the center. If it is wrong, the accusing player must pick up the pile in the center. The goal is to get rid of all of one’s cards.

I’ve been playing this game for years, but I didn’t notice until quite recently that the reason the game terminates in practice is that $13$, the number of types of cards in a standard deck, is prime. If, for example, we stopped playing with aces and only used $12$ types of cards, then a game with $4$ people (note that $4 \mid 12$) need not terminate. Consider a game in which Player 1 has only cards $2, 6, 10$, Player 2 has only cards $3, 7, J$, Player 3 has only cards $4, 8, Q$, and Player 4 has only cards $5, 9, K$, and suppose that Player 1 has to play threes at some point in the game. Then no player can get rid of their cards without lying; since the number of players divides the number of card types, every player will always be asked to play a card they don’t have. Once every player is aware of this, every player can call out every other player’s lies, and it will become impossible to end the game reasonably.

More generally, such situations can occur if $13$ is replaced by a composite number $n$ such that the number of players is at least the smallest prime factor of $n$. This is because people who get rid of their cards will leave the game until the number of players is equal to the smallest prime factor of $n$, at which point the game may stall. But because $13$ is prime, any game played with fewer than $13$ people has the property that each player will eventually be asked to play a card that they have.
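The residue-class argument above is easy to check mechanically. In the sketch below (the function name is my own), turn $t$ of the game asks for rank $t \bmod m$ and is taken by player $t \bmod p$; when $\gcd(p, m) = 1$ every player eventually sees every rank, while for $4$ players and $12$ ranks the ranks split into disjoint residue classes, which is exactly the stall described above:

```python
def ranks_asked(num_ranks, num_players, player):
    # Turn t asks for rank t mod num_ranks; player i takes the turns t ≡ i (mod num_players).
    return {t % num_ranks for t in range(player, num_ranks * num_players, num_players)}

# 13 ranks (prime): every one of 4 players is eventually asked for every rank.
assert all(ranks_asked(13, 4, i) == set(range(13)) for i in range(4))

# 12 ranks: each of 4 players only ever sees 3 ranks -- the non-terminating configuration.
assert [sorted(ranks_asked(12, 4, i)) for i in range(4)] == \
    [[0, 4, 8], [1, 5, 9], [2, 6, 10], [3, 7, 11]]
```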

## Ramsey theory and Fermat’s Last Theorem

In the first few lectures of Graph Theory, the lecturer (Paul Russell) presented a cute application of Ramsey theory to Fermat’s Last Theorem. It makes a great introduction to the general process of casting a problem in one branch of mathematics as a problem in another and is the perfect size for a blog post, so I thought I’d talk about it.

The setup is as follows. One naive way to go about proving the nonexistence of nontrivial integer solutions to $x^n + y^n = z^n, n > 2$ (that is, solutions such that $x, y, z$ are not equal to zero) is using modular arithmetic; that is, one might hope that for every $n > 2$ it might be possible to find a modulus $m$ such that the equation has no nontrivial solution $\bmod m$. To simplify matters, we’ll assume that $x, y, z$ are relatively prime to $m$, or else there is some subtlety in the definition of “nontrivial” (e.g. we might have $x, y, z$ not divisible by $m$ but $x^n \equiv 0 \bmod m$.) Note that it might be the case that $m$ is not relatively prime to a particular nontrivial solution in the integers, but if we can prove non-existence of nontrivial solutions for infinitely many $m$ (in particular, such that any integer is relatively prime to at least one such $m$) then we can conclude that no nontrivial integer solutions exist.

By the Chinese Remainder Theorem, this is possible if and only if it is possible with $m$ a prime power, say $m = p^k$. If $p$ is relatively prime to $n$, this is possible if and only if it is possible with $m = p$. This is because given a nontrivial solution $\bmod p$ we can use Hensel’s lemma to lift it to a nontrivial solution $\bmod p^k$ for any $k$ (and even to $\mathbb{Z}_p$), and the converse is obvious. (Again to simplify matters, we’ll ignore the finitely many primes that divide $n$.) So we are led to the following question.
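The lifting step can be illustrated with a small Newton-iteration implementation of Hensel's lemma. This is only a sketch: the helper name and the sample solution $5^3 + 5^3 \equiv 4^3 \bmod 31$ (which one can check directly) are my own illustrative choices.

```python
def hensel_lift(f, df, r, p, k):
    """Lift a simple root r of f mod p to a root mod p**k by Newton iteration."""
    assert f(r) % p == 0 and df(r) % p != 0  # simple root mod p
    root, mod = r % p, p
    for _ in range(k - 1):
        mod *= p
        # Newton step mod the enlarged modulus; df(root) is invertible since p ∤ df(r).
        root = (root - f(root) * pow(df(root), -1, mod)) % mod
    return root

# 5^3 + 5^3 ≡ 4^3 (mod 31); fix y = 5, z = 4 and lift the solution x = 5 to mod 31^3.
f = lambda x: x**3 + 5**3 - 4**3
df = lambda x: 3 * x**2
x = hensel_lift(f, df, 5, 31, 3)
assert f(x) % 31**3 == 0
```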

For a fixed positive integer $n > 2$ do there exist infinitely many primes $p$ relatively prime to $n$ such that $x^n + y^n \equiv z^n \bmod p$ has no nontrivial solutions?

As it turns out, the answer is no. In 1916 Schur found a clever way to prove this by proving the following theorem.

Theorem: For every positive integer $k$ there exists a positive integer $m$ such that if $\{ 1, 2, \dots, m \}$ is partitioned into $k$ disjoint subsets $A_1, \dots, A_k$, then there exists $i$ such that there exist $a, b, c \in A_i$ with $a + b = c$. In other words, the Schur number $S(k) = m$ exists. (Note that I am using a convention which is off by $1$.)
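For small $k$ the theorem can be verified by brute force. In the off-by-one convention above, $S(2) = 5$: there is a $2$-coloring of $\{1, \dots, 4\}$ with no monochromatic solution of $a + b = c$, but every $2$-coloring of $\{1, \dots, 5\}$ has one. A quick sketch (the helper name is mine):

```python
from itertools import product

def has_schur_triple(coloring):
    # coloring: dict n -> color; look for monochromatic a + b = c with a <= b.
    ns = list(coloring)
    return any(coloring[a] == coloring[b] == coloring[a + b]
               for a in ns for b in ns if a <= b and a + b in coloring)

# {1,...,4} admits a 2-coloring with no monochromatic a + b = c ...
assert any(not has_schur_triple(dict(zip(range(1, 5), c)))
           for c in product([0, 1], repeat=4))
# ... but every 2-coloring of {1,...,5} contains one, so S(2) = 5 in this convention.
assert all(has_schur_triple(dict(zip(range(1, 6), c)))
           for c in product([0, 1], repeat=5))
```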

If we let $p$ be a prime greater than $S(n)$ and let the $A_i$ be the cosets of the subgroup of $n^{th}$ powers in $(\mathbb{Z}/p\mathbb{Z})^{*}$, which has index at most $n$, we obtain the following as a corollary.

Corollary: Fix a positive integer $n > 2$. For any sufficiently large prime $p$, there exists a nontrivial solution to $x^n + y^n \equiv z^n \bmod p$.
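The corollary is easy to probe computationally. For $n = 3$, a brute-force search (sketch below; the helper name is mine) finds no nontrivial solution mod $7$, but does find one mod the larger prime $31$:

```python
def nontrivial_solution(n, p):
    """Brute-force search for x, y, z in (Z/p)^* with x^n + y^n ≡ z^n (mod p)."""
    nth_powers = {pow(z, n, p): z for z in range(1, p)}
    for x in range(1, p):
        for y in range(1, p):
            s = (pow(x, n, p) + pow(y, n, p)) % p
            if s in nth_powers:
                return x, y, nth_powers[s]
    return None

# n = 3: no nontrivial solution mod 7, but one exists mod the larger prime 31.
assert nontrivial_solution(3, 7) is None
x, y, z = nontrivial_solution(3, 31)
assert (x**3 + y**3 - z**3) % 31 == 0
```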

## Update, and the combinatorics of quintic equations

A brief update. I’ve been at Cambridge for the last week or so now, and lectures have finally started. I am, tentatively, taking the following Part II classes:

• Riemann Surfaces
• Topics in Analysis
• Probability and Measure
• Graph Theory
• Linear Analysis (Functional Analysis)
• Logic and Set Theory

I will also attempt to sit in on Part III Algebraic Number Theory, and I will be self-studying Part II Number Theory and Galois Theory for the Tripos.

As far as this blog goes, my current plan is to blog about interesting topics which come up in my lectures and self-study, partly as a study tool and partly because there are a few big theorems I’d like to get around to understanding this year and some of the material in my lectures will be useful for those theorems.

Today I’d like to blog about something completely different. Here is a fun trick, the first half of which I learned somewhere on MO. Recall that the Abel-Ruffini theorem states that the roots of a general quintic polynomial cannot in general be written down in terms of radicals. However, it is known that it is possible to solve general quintics if, in addition to radicals, one allows Bring radicals. Stated in a form which will be particularly convenient for this post, this is equivalent to being able to solve a quintic of the form

$\displaystyle y = 1 + xy^5$

for $y$ in terms of $x$. It just so happens that one branch of the above function has a particularly nice Taylor series; in fact, the branch analytic in a neighborhood of the origin is given by

$\displaystyle y = \sum_{n \ge 0} \frac{1}{4n+1} {5n \choose n} x^n$.

This should remind you of the well-known fact that the generating function $\displaystyle y = \sum_{n \ge 0} \frac{1}{n+1} {2n \choose n} x^n$ for the Catalan numbers satisfies $y = 1 + xy^2$. In fact, there is a really nice combinatorial proof of the following general fact: the generating function $\displaystyle y = \sum_{n \ge 0} \frac{1}{(k-1)n+1} {kn \choose n} x^n$ satisfies

$y = 1 + xy^k$.