Bill Gasarch is right – writing technical posts is tiring! (I’ve been trying to finish the next GILA post for days.) So I’ll share some more thoughts instead. Today’s thought was triggered by David Corfield:
In the first of the above posts I mention Leo Corry’s idea that professional historians of mathematics now write a style of history very different from older styles, and those employed by mathematicians themselves. …
To my mind a key difference is the historians’ emphasis in their histories that things could have turned out very differently [emphasis mine], while the mathematicians tend to tell a story where we learn how the present has emerged out of the past, giving the impression that things were always going to turn out not very dissimilarly to the way they have, even if in retrospect the course was quite tortuous.
This in turn reminded me of something else Rota wrote about his Walter Mitty fantasies:
Let us begin with a piece of history-fiction, and imagine how Riemann might have discovered the Riemann zeta function.
Professor Riemann was aware that arithmetic density is of fundamental importance in number theory. If $A$ is a subset of the positive integers $\mathbb{N}$, then the arithmetic density of the set $A$ is defined to be

$d(A) = \lim_{n \to \infty} \frac{|A \cap \{1, 2, \dots, n\}|}{n}$

whenever the limit exists. For example, $d(\{\text{even numbers}\}) = 1/2$. If $A_p$ is the set of multiples of the prime $p$, then $d(A_p) = 1/p$; what is more appealing, one easily computes that

$d(A_p \cap A_q) = \frac{1}{pq} = d(A_p) \, d(A_q)$

for any two primes $p$ and $q$. If density were a (countably additive) probability measure, we would infer that the events that a randomly chosen number is divisible by either of two primes are independent [emphasis mine]. Unfortunately, arithmetic density shares some but not all properties of a probability measure. It is most emphatically not countably additive.
After a period of soul-searching, Professor Riemann was able to find a remedy to some deficiencies of arithmetic density by a brilliant leap of imagination. He chose a real number $s > 1$ and defined the measure of a positive integer $n$ to equal $n^{-s}$; in this way, the measure of the set $\mathbb{N}$ of positive integers turned out to equal $\zeta(s) = \sum_{n \ge 1} n^{-s}$. Therefore, he could define a (countably additive) probability measure $P_s$ on the set $\mathbb{N}$ of positive integers by setting $P_s(\{n\}) = \frac{n^{-s}}{\zeta(s)}$.
Riemann then proceeded to verify what he had sensed all along, namely the fundamental property

$P_s(A_p \cap A_q) = P_s(A_p) \, P_s(A_q)$.

In other words, the events $A_p$ and $A_q$ that a randomly chosen integer $n$ be divisible by one of the two primes $p$ or $q$ are independent relative to the probability $P_s$.
The Riemann zeta function was good for something after all.
…
Long after Riemann was gone, it was shown, again subject to technical assumptions, that the probabilities $P_s$ are the only probabilities defined on the set $\mathbb{N}$ of natural integers for which the events of divisibility by different primes are independent. This fact seems to lend support to the program of proving results of number theory by probabilistic methods based on the Riemann zeta function.
Rota doesn’t cite any sources, but this is a brilliant piece of insight, one he goes on to dwarf with an even better one. I want to use this example because I want to see more writing like this, and I’ll tell you why.
I don’t believe that any particular development in mathematical history was inevitable. I don’t believe that either the mathematical history or the axiomatic presentation of a subject is the only thing to say about its origins. As the mathematical community takes a new idea, such as category theory, and begins to apply it to fields other than the one in which it originated, we gain hindsight. The problem with focusing on axioms, as I have said, is that an axiomatic treatment of a subject tends to ignore that this process of gaining hindsight ever happened. It also lends itself to a view of mathematics as “God-given,” as opposed to being driven strongly by human concerns and human modes of understanding. On the other hand, abstraction and generality are our best tools for compressing mathematical insight, and often it isn’t clear what the clearest or most general way to study a particular phenomenon is until decades after the first attempt.
With these concerns in mind, I want to propose a way of introducing certain concepts or subjects based on historical fiction. Instead of axioms or the actual historical progression of a concept, perhaps it would be instructive to invent a fictional history fueled by modern hindsight. For example, identify a mathematician who was in a position to discover a concept but lacked a key insight, and imagine, as Rota did, that the insight was available. For simpler concepts, it might suffice to tell a story with the reader as protagonist about how one might reasonably go about writing down the axioms of, say, a group.
Perhaps I’m just asking for people to motivate things better in general. But does anyone know of any textbooks or expository articles employing primarily this strategy? I would love to see a textbook begin every chapter with a story.
Here is another very nice example of fictional mathematical thinking: http://gowers.wordpress.com/2010/12/09/finding-cantors-proof-that-there-are-transcendental-numbers/
18.314 Professor Rota circa 1977 using ‘Generalissimo Chung’s Book’ as Rota called it! Good times. RIP G.C. Rota!
[…] function! (This is one way to flesh out the tantalizing statement made by Rota which I quoted in this old blog post, although I don’t know if it’s the precise result that Rota had in mind.) This […]
[…] and more importantly, this is to be a piece of historical fiction. Although I don’t know how Roth came upon his original proof, he was a number theorist and an […]
I love that excerpt from Rota. It seems much less mysterious to me now why the Riemann zeta-function is of such importance to number theory. Thanks for sharing!
I’d argue that emphasizing fictional history (which comprises, for example, most Calculus texts) is very dangerous. Not because I think that we necessarily want history to be “true”. But because one of the problems of explanations-with-hindsight is that it gives a very wrong impression of how mathematicians (like Riemann) come up with new ideas. I’m particularly worried about senior undergrads / early graduate students just starting to “do research”: an important part of learning to do research is to realize that The Greats also spent years not getting anywhere.
Theo, I agree that calculus as it’s currently taught is beyond stupid (let’s face it — no one would think of the epsilon-delta definitions unless they absolutely had to, which they did, but 99% of calculus students don’t need to know that) but I think that fictional mathematical history doesn’t have to obscure insight. For instance, I’d argue that it borders on criminal to teach Galois theory from the permutation group/solution of equations perspective instead of the infinitely more elegant field theory perspective, even though it was originally developed to solve problems relating to solution of equations!
Qiaochu: I feel like I’ve seen articles pretty similar to what you describe, but I can’t for the life of me remember what they were. Maybe I’ll write one!