While Googling something or other, I came across Nine Chapters on the Semigroup Art, which is a leisurely introduction to the theory of semigroups. (While the document is labelled “lecture notes”, the typography is quite beautiful.)
Happy New Year! The publicity in the wake of Grothendieck’s death has left a certain number of non-mathematicians with the question of what it was exactly that he did. I wrote an answer elsewhere that people seemed to find informative, so I’m saving it here for posterity.
This post is as untechnical as I could make it. Grothendieck’s work is incredibly technical, even by modern standards of abstract mathematics, so my description is, if you’re being charitable, highly impressionistic, and if you’re not, wrong in many major details. I also only discussed schemes and the Weil conjectures, which is only part of what Grothendieck is famous for.
Since Descartes, a major topic of mathematics research has been understanding the solutions to polynomial equations. Descartes observed that while finding solutions is a matter of algebra, when you view all of the solutions together, you enter the realm of geometry. For example, the set of solutions to X^2 + Y^2 = 1 is a circle.
The set of solutions to one or more polynomial equations is called a variety, and the study of such things is called algebraic geometry.
Originally, algebraic geometry involved solutions in real or complex numbers. (Usually the complex numbers, because that turns out to be much easier, since you can freely take square roots, etc., without having to worry about signs.) But the only things you need for the definitions to work are that you can add, subtract, and multiply. (A set where you can add, subtract, and multiply is called a ring.) There are lots of rings.
So Grothendieck set out to generalize algebraic geometry to arbitrary rings. His generalization of a variety to this setting is called a scheme. Interestingly, if you start with a variety (over the complex numbers), there’s a standard way to associate a ring with it, and in that case Grothendieck’s construction doesn’t give you anything new. It’s for the other kinds of rings that you get something new. So there’s a partial dictionary between varieties and rings, and schemes fill in the missing entries of the dictionary.
Another example of a ring is the integers — you can add, subtract, and multiply integers. Here the idea of schemes captures a weird idea that goes back to the nineteenth century. The scheme for the integers consists of one point for each prime number. So you can picture the integers as points on a straight line at 2, 3, 5, 7, … and nowhere else. (Physicists would put an extra point at 9, and Grothendieck himself would put an extra point at 57.) So schemes are naturally related to number theory, and in fact have helped prove theorems in number theory such as Fermat’s Last Theorem.
On to the Weil conjectures. Think of clockwork arithmetic. You can add, subtract, and multiply hours or minutes on a clock face. In each case, you do the arithmetic with ordinary numbers, and then you throw away multiples of 12 (for hours), or 60 (for minutes). This operation of throwing away multiples is called the “modulo” operator. So 7 times 2 modulo 12 is 2.
There are a couple of other instances of the modulo operator that you’ve probably used without knowing about it. Taking the last digit of a number is the same as that number modulo 10. So 1234 modulo 10 is 4. Adding up the digits of a number gives a result congruent to the number modulo 9. If you ever learned the trick of checking whether a number is a multiple of 3 by adding up the digits and checking that, you were actually working modulo 9.
Numbers modulo N give you another ring — you can add, subtract, or multiply modulo N, and that gives you another number modulo N.
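These rules are easy to check directly (a throwaway Python sketch; the `digit_sum` helper is just mine, for illustration):

```python
# Clockwork arithmetic: do ordinary arithmetic, then throw away
# multiples of N with the % ("modulo") operator.

N = 12  # hours on a clock face

assert (7 + 7) % N == 2   # 7 o'clock plus 7 hours is 2 o'clock
assert (7 * 2) % N == 2   # 7 times 2 is 14, which is 2 modulo 12

# The last digit of a number is the number modulo 10.
assert 1234 % 10 == 4

# The digit sum of a number is congruent to the number modulo 9,
# which is why the "add up the digits" divisibility tricks work.
def digit_sum(n):
    return sum(int(d) for d in str(n))

for n in (1234, 99, 12345678):
    assert digit_sum(n) % 9 == n % 9
```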
What’s nice about numbers modulo N is that there are finitely many of them. They’re also useful in number theory. Let’s say that you want to know whether there are solutions to some polynomial equation over the integers — say X^3 + Y^3 = Z^3. One easy check is to see if there are any solutions modulo N. If there aren’t, then there aren’t any solutions at all. So an interesting question for number theory is: how many solutions are there modulo N?
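Because there are finitely many numbers modulo N, this count can be done by sheer brute force (a Python sketch; `count_solutions` is a name I made up, not anything standard):

```python
# Brute-force count of solutions to x^3 + y^3 = z^3 modulo N.
# This is the quantity that the Weil conjectures describe.

def count_solutions(N):
    """Number of triples (x, y, z) mod N with x^3 + y^3 = z^3 modulo N."""
    return sum(
        1
        for x in range(N)
        for y in range(N)
        for z in range(N)
        if (x**3 + y**3 - z**3) % N == 0
    )

# There are always "trivial" solutions (for example x = z, y = 0), so this
# check can never rule out x^3 + y^3 = z^3 over the integers -- Fermat's
# Last Theorem takes far more. But for other equations, an empty count
# modulo some N settles the question immediately.
print(count_solutions(7))
```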
André Weil (whose sister was the philosopher Simone Weil) conjectured a kind of formula for the number of solutions modulo N. He did so via a far-fetched analogy with topology.
Take a disk (a filled-in circle), and consider a continuous map of the disk to itself. One example of a continuous map is a rotation, where you spin the disk around its middle. The point you spin it around is a fixed point — it doesn’t move. You can prove (and it’s a difficult theorem) that every continuous map of the disk to itself has to have at least one fixed point. There is a more general formula, called the Lefschetz fixed point formula, that allows you to count the number of fixed points in general (for shapes more complicated than disks).
For the integers modulo N, you can add, subtract, and multiply, but you can’t always divide, and you can’t always do things like take square roots. (Here, x is the square root of y modulo N if x*x is y modulo N. So 3 is the square root of 2, modulo 7. Pretty weird, huh?)
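Square roots modulo N are easy to find by brute force (an illustrative Python sketch; the `square_roots` helper is hypothetical):

```python
# Brute-force square roots modulo N: x is a square root of y modulo N
# when x*x leaves remainder y on division by N.

def square_roots(y, N):
    """All x in 0..N-1 whose square is congruent to y modulo N."""
    return [x for x in range(N) if (x * x - y) % N == 0]

print(square_roots(2, 7))   # → [3, 4], since 3*3 = 9 = 2 modulo 7
print(square_roots(3, 7))   # → [], 3 has no square root modulo 7
```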
The division problem is easily fixed — just make N a prime. The root problem is harder to solve, since some numbers don’t have square roots, cube roots, etc., even if N is a prime. The solution is to add “imaginary numbers” modulo N, the same way that we add i, the square root of -1, to get the complex numbers. The complex numbers have an operation defined on them, called conjugation, that sends i to -i. There’s a similar operation modulo N, called the Frobenius automorphism.
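For concreteness, here is a toy model (my own construction, not a library) of this with N = 7: pairs (a, b) stand for a + b·i with a, b taken modulo 7 and i·i = -1, which works because -1 has no square root modulo 7. The Frobenius automorphism is raising to the 7th power, and it behaves exactly like conjugation:

```python
# A toy model of the field with 49 elements: pairs (a, b) represent
# a + b*i with a, b modulo 7 and i*i = -1.

P = 7

def mul(z, w):
    """Multiply a + b*i by c + d*i, using i*i = -1 and reducing modulo P."""
    a, b = z
    c, d = w
    return ((a * c - b * d) % P, (a * d + b * c) % P)

def frobenius(z):
    """The Frobenius automorphism: raise z to the P-th power."""
    result = (1, 0)
    for _ in range(P):
        result = mul(result, z)
    return result

# Frobenius sends a + b*i to a - b*i (the analogue of conjugation),
# and the elements it fixes are exactly the ordinary numbers modulo 7.
for a in range(P):
    for b in range(P):
        assert frobenius((a, b)) == (a, (-b) % P)
```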
Weil said that if we pretend that working modulo N gives a kind of space, then we can apply the Lefschetz fixed point theorem and count the number of solutions. This is a completely far-fetched analogy, because there’s no geometry here.
That’s where schemes come in. Schemes supply the missing geometry. Grothendieck showed how to generalize the topological techniques to this setting so that a version of the Lefschetz fixed point theorem could be proven, settling the Weil conjectures. The proof is absurdly hard and abstract, but it is related to a relatively concrete question. (Unfortunately, the formula the conjectures give you is itself a bit hard to use, so I don’t know any easy explanation of what it means, but I think it does have some real-world applications in coding theory and cryptography.)
George Lowther at Almost Sure has written a terrific series of posts explaining stochastic processes and the stochastic calculus. Stochastic calculus is widely used in physics and finance, so there are many informal introductions that get across the main ideas in a form sufficient for applications. Most of the formal presentations of the subject seem very far away from the informal ones, to an unusual extent. For example, for the important technical notion of semimartingale the Wikipedia definition is the usual one, which has a very different flavor from the naive picture useful in applications. Lowther introduces it directly in terms of the stochastic integral, and the stochastic integral itself is introduced as a limiting process of random sums of a particularly simple form. The random sums are pretty much the same things you would write down in a naive presentation.
As is well-known, the lattice of submodules of a module is modular. What I did not know is that the converse is not true, and that lattices of submodules must satisfy a stronger property, the arguesian law.
The Arguesian law is a lattice-theoretic analogue of Desargues’ theorem in projective geometry. I read the statement of the theorem several times and I have no intuition about what it means.
There is a kind of converse to this result: a complemented lattice can be embedded into the lattice of submodules of a module if and only if it is arguesian. (I found the result in Grätzer’s book on lattice theory, which is viewable in Google Books.)
Chandan Singh Dalawat has a nice survey article about K2. It just gives the highlights of the theory, without proofs, so it’s closer to a teaser trailer than to a full-length movie. But sometimes you just want a teaser trailer to tell you whether to invest the time in the movie.
I came across this terrific article that describes a sequence of results beginning with Pappus’s theorem, running through the Cayley-Bacharach theorem, to modern formulations in terms of the Gorenstein (!) condition.
The connection between classical topics in algebraic geometry and modern techniques is fascinating.
I periodically feel like I should learn more about nonassociative algebra. (I’ve studied Lie algebras, and technically Lie algebras are non-associative, but they’re pretty atypical of nonassociative algebras.) There’s a mysterious circle of “exceptional” examples that are all related — the octonions, the five exceptional Lie algebras, the exceptional Jordan algebra — that I would like to understand better. John Baez has an article about the direct connection that I posted about before, but what I don’t understand about the general theory is how relaxing associativity gives you so few new examples.
I previously linked to an article classifying the simple Moufang loops. The only examples that are not groups are again related to the octonions.
One of my ambitions in life is to understand projective determinacy. Fortunately, Tim Gowers has written a series of posts to explain Martin’s proof that Borel sets are determined.
The main source of interest in determinacy is that results suggest it is the strongest regularity property that a set can have, in that it tends to imply other nice properties such as Lebesgue measurability. Here is a short proof by Martin that determinacy implies Lebesgue measurability. Justin Palumbo has a nice set of lecture notes that relate determinacy to other regularity properties.
(One nuance is that determinacy for a single set usually doesn’t imply strong regularity properties — the proofs typically require several auxiliary games for a single set. The Martin and Palumbo links use the setting of the axiom of determinacy, which is the axiom that all sets are determined. This is actually false in ZFC: it contradicts the axiom of choice. There are analogous results that hold in ZFC where you keep track of which sets you need to have determined.)
I came across a number theory paper Twists of X(7) and Primitive Solutions of x^2 + y^3 = z^7 that I find completely fascinating. I find it fascinating because a) the question is so easy, b) the answer is so hard, and yet c) someone was able to answer it.
An earlier expository paper, Faltings plus epsilon, Wiles plus epsilon, and the Generalized Fermat Equation, talks about the general question of finding solutions to the “generalized Fermat equation”, x^p + y^q = z^r.
Ugh, I suck at this blogging thing. I periodically get ambitious, and make big plans. That doesn’t actually lead to any completed posts, just many long half-finished posts, and hundreds of open tabs in Firefox. I think I’ll start with some short posts.
Linear types are one of those things that I’d always wanted to learn more about. The idea seems somewhat natural — practically speaking, the amount of resources an object uses is part of its signature — but the details are sufficiently complex that I’ve never quite mastered them. This presentation by François Pottier seems like a nice place to start.
Via Lambda the Ultimate.