Interestingly, all known public-key cryptosystems seem to depend on the difficulty of the hidden subgroup problem. Suppose you have a group that you can observe, and a subgroup that you cannot. Instead, you have a function that is constant on each coset of the hidden subgroup and takes different values on different cosets. The hidden subgroup problem is to compute a generating set for the subgroup just by evaluating the function. The problem generalizes integer factorization, the graph isomorphism problem, and the problem of finding the shortest vector in a lattice. An efficient algorithm would apparently crack all known public-key cryptosystems.
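To make the setup concrete, here is a toy sketch (the group, subgroup, and function below are made up for illustration): a function on Z/12Z that hides the subgroup {0, 4, 8}, and a brute-force classical recovery of that subgroup. The interesting question, of course, is doing this with far fewer function evaluations than brute force.

```python
# Toy illustration of the hidden subgroup problem in the cyclic group Z/12Z.
# The hidden subgroup H = {0, 4, 8} is hidden inside a function f that is
# constant on each coset of H and distinct on different cosets.

N = 12
H = {0, 4, 8}

# Assign each group element a label: elements in the same coset of H
# get the same label (we use the coset itself as the label).
labels = {}
for g in range(N):
    coset = frozenset((g + h) % N for h in H)
    labels[g] = coset

def f(g):
    return labels[g % N]

# Classical recovery (exponentially many evaluations in general): the
# hidden subgroup is exactly the set of elements with the same f-value
# as the identity.
recovered = {g for g in range(N) if f(g) == f(0)}
print(sorted(recovered))  # → [0, 4, 8]
```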
Chris Lomont has a survey paper on the hidden subgroup problem in quantum computing, one that does not assume any background in quantum mechanics. Dave Bacon has some thoughts on an alternate approach.
I was under the impression that the uncracked public-key cryptosystems were all based on number theory, which made them vulnerable to variants of Shor’s algorithm. Yesterday I learned via Dave Bacon that there are cryptosystems based on the hardness of finding the shortest vector in a lattice. Here is a survey paper on the subject by Oded Regev. There is also the McEliece cryptosystem, which is based on coding theory.
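The shortest vector problem itself is easy to state. As a toy sketch (the basis below is made up; real cryptographic lattices have hundreds of dimensions, where no efficient algorithm is known), here is a brute-force search for a short vector in a two-dimensional lattice given by a deliberately skewed basis:

```python
# Brute-force search for the shortest nonzero vector in the 2D lattice
# spanned by b1 and b2. The basis vectors are long and nearly parallel,
# but the lattice contains a much shorter vector.
from itertools import product

b1 = (1, 17)
b2 = (0, 19)

def norm_sq(v):
    return v[0] ** 2 + v[1] ** 2

# Enumerate small integer combinations x*b1 + y*b2 and keep the
# shortest nonzero vector found.
best = None
for x, y in product(range(-30, 31), repeat=2):
    if (x, y) == (0, 0):
        continue
    v = (x * b1[0] + y * b2[0], x * b1[1] + y * b2[1])
    if best is None or norm_sq(v) < norm_sq(best):
        best = v

print(best, norm_sq(best))
```

In two dimensions this problem is classically easy (Gauss already knew a reduction algorithm); the cryptographic hardness only appears in high dimensions.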
We here at Ars Mathematica set back the course of complexity theory by several hours, and Scott Aaronson thanks us for it.
Logician Solomon Feferman, who did his PhD under Alfred Tarski, has just written a short article on Tarski’s influence on computer science. It seems that this influence was wider than just the connection between cylindric algebras and relational database theory.
This thread about famous women mathematicians on Cocktail Party Physics reminded me of an interesting figure in history that I came across while doing research for a Wikipedia article: Grete Hermann. (The Wikipedia article is a skeleton that I created; it could use a lot of work.)
Hermann was a student of Emmy Noether. Noether was one of the iconic figures of twentieth-century mathematics, a key figure in the century’s trend toward abstraction. A typical example is her proof of the Lasker-Noether theorem. The theorem, that every ideal in such a ring has a primary decomposition, was originally proven for polynomial rings by Emanuel Lasker, using a difficult computational argument. Noether identified the key abstract condition behind the result, the ascending chain condition on ideals, and used it to give a shorter proof of a much more general theorem. Rings that satisfy the ascending chain condition on ideals are now known as Noetherian rings in her honor.
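To see what a primary decomposition looks like, here is a standard textbook example (not one taken from the original papers): in the polynomial ring $k[x, y]$,

```latex
(x^2,\ xy) \;=\; (x) \,\cap\, (x, y)^2 .
```

The ideal $(x^2, xy)$ is not primary itself, but it is the intersection of the prime ideal $(x)$ and the ideal $(x, y)^2$, which is primary with radical $(x, y)$.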
While Hermann was Noether’s student, her thesis was a throwback to the nineteenth century’s computational approach. Hermann showed that Lasker’s approach could be turned into an effective procedure for computing primary decompositions. Hermann did this before the invention of the computer, and even before the notion of an effective procedure had been formalized. (As her definition, Hermann used the existence of an explicit upper bound on time complexity, and she gave such a bound for primary decomposition and for other questions in commutative ring theory.)
Hermann went on to work in philosophy and the foundations of physics. John von Neumann had proposed a proof that a hidden variable theory of quantum mechanics could not exist. (A hidden variable theory is one that explains the random behavior of quantum mechanical systems in terms of unobserved deterministic variables.) Hermann discovered and published the flaw in von Neumann’s proof back in 1935, a result that had no impact until it was rediscovered by John Bell some thirty years later.
(The thread on Cocktail Party Physics is instructive for just how unfamous mathematicians really are. For physicists, Karl Weierstrass is an obscure historical figure. For mathematicians, of course, Weierstrass is five times as famous as Madonna and Britney Spears combined. It was interesting to learn that Sofia Kovalevskaya is not particularly well-known among physicists, even though part of her research was in classical mechanics.)
This post by Dave Bacon on his weblog makes quantum computing sound like a modest extension of classical computing, one that works by speeding up the computation of Fourier transforms on Z/2Z: quantum computers can be built out of just two gates, the Toffoli gate (which is universal for classical computation) and the Hadamard gate, which implements the Fourier transform on Z/2Z. The full discrete Fourier transform can be built out of this special case.
Dave links to a short proof of the universality of this family of gates by Dorit Aharonov.
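The two gates are easy to write down and check numerically. Here is a quick sketch (not part of the linked proof) verifying that the Hadamard gate coincides with the 2-point discrete Fourier transform, and that the Toffoli gate is a classical reversible operation, i.e. a permutation matrix:

```python
import numpy as np

# Hadamard gate.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# The unitary DFT matrix on Z/2Z: entries omega^(jk) / sqrt(2),
# where omega = exp(2*pi*i/2) = -1.
omega = np.exp(2j * np.pi / 2)
dft2 = np.array([[omega ** (j * k) for k in range(2)]
                 for j in range(2)]) / np.sqrt(2)
assert np.allclose(H, dft2)  # the 2-point DFT is just the Hadamard gate

# Toffoli gate on 3 qubits: flips the target bit when both controls
# are 1, i.e. it swaps the basis states |110> and |111>.
toffoli = np.eye(8)
toffoli[[6, 7]] = toffoli[[7, 6]]

# Both gates are unitary; the Toffoli is moreover a 0/1 permutation matrix.
assert np.allclose(H @ H.conj().T, np.eye(2))
assert np.allclose(toffoli @ toffoli.T, np.eye(8))
print("checks pass")
```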
Alerted by a post of sigfpe, I learnt about George Spencer-Brown’s 1972 book Laws of Form. Reading Louis Kauffman’s account of the theory, I was struck by the similarity to Robin Milner‘s theory of bigraphs (see here for papers). From a talk I heard him give a few years ago, I believe that Milner’s theory was originally intended as a rigorous category-theoretic account of hyperlinks in computer networks. Has anyone explored the connections between these two mathematical theories?
Statisticians among us may be interested in Gapminder, free data visualization software from Sweden, originally developed to help communicate information about global development. The software makes use of web animation tools such as Flash.