Thursday, November 30, 2006

Elucidating low technology

All manner of news outlets are reporting on the just-announced unraveling of the heretofore mysterious Antikythera mechanism, rediscovered over 100 years ago after falling to the bottom of the Sea of Crete nearly 2000 years before that. Network World has a few photos of the X-ray and tomographic technicians at work, and Wired has put up some beautiful pictures of the device produced from Hewlett-Packard's gallery of reflectance images, in which the user can control how the object is lit. Especially interesting are the fragments of documentation etched into the works, although I must say, it's all Greek to me.

As long-time readers know, I have a soft spot for collisions between the old and the new. It's rather appealing to see the advances in imaging technology over the past century reverse the effects of the elements grinding away for millennia. Consider how lucky we are that it was recovered late enough in history that non-destructive methods were used to study it; it's not too hard to imagine a Victorian-era engineer attempting to take it apart or washing it with baking soda and vinegar.

Having recently read Guns, Germs, and Steel, I can't help but be reminded of the Phaistos disk, another artifact illustrating that Cretan technology was far ahead of its time. It would seem that in both cases the adaptive advantage offered by adopting the new technology (in the case of Phaistos, movable type; in the case of Antikythera, geared wheels) was insufficient to merit the labor required to implement it. The criteria for an innovative idea to be "good" depend on context much more than many people realize.

Monday, November 27, 2006

The other half

Appropriately enough, on the day that I filed my dissertation, a paper went up completing my work on characterizing B2-crystals by local criteria. It's great to see this, since my article proved only half of the desired result. As I said before, it's good to have publicly visible work; had I not, I would not have been given due acknowledgment in both the references and the introduction.

Especially exciting is the possibility that the approach taken by V. I. Danilov, A. V. Karzanov, and G. A. Koshevoy might be generalizable to bigger algebras, so we might see a local characterization of G2-crystals sometime next year!

Wednesday, November 22, 2006

Ph.inally D.one

There's another F-word that I could have used to title this post, but I know that some of my readership prefer more Ph.amily-Ph.riendly language.

It's hard to get a Ph.D., and not just in the sense that a lot of work goes into the coursework, finding a topic, making an original contribution to the field, etc. After it seems that everything is done; after all 130 pages have been edited, re-edited, and verified by three committee members; after it's been printed on at least 20# paper with at least 25% cotton content; after the cover sheet has been signed by all three members (one of whom is in Australia, another of whom is now in Paris); after two additional copies of the abstract (without page numbers) have been printed; the work isn't done. There are two anonymous questionnaires, one from the NSF and one from UC Davis; there is a release form authorizing the powers that be to copy the document that I spent the last four years preparing onto microfilm and bind it in the library; there is an appointment (between the hours of 1:00 and 4:00 on Tuesday) to be made with one of the staff in the office of graduate studies, wherein the above forms are put in order to be passed on to the appropriate offices, and every page of my dissertation is examined to ensure that the pages are consecutive, as are the chapters, and the sections, and the subsections, and that my font is sufficiently large and uniform throughout, and that the margins are respected (god forbid you disrespect the margins!). Frankly, if this kind of administrative scrutiny were demanded of bachelor's recipients, the numbers for that degree would be much lower. But if you can make it through all of that, then you will have satisfied all requirements of the degree of Doctor of Philosophy, and I'm proud to report that I have done precisely that.


I was somewhat surprised by my elation when D. Swindall handed me that certificate. For many months now, when asked how far along I was, I've said something like "It's mostly paperwork at this point," or "I just have some administrative details to take care of," or "I'm down to minor editing." After telling such a white lie so many times, I started to believe that the work I had left really was negligible, and that, for all intents and purposes, I was done. I had even walked; how much could that feeling of completion be enhanced by mere paperwork?

A whole lot, as it turns out. After spending those fifteen minutes yesterday witnessing the inspection of my dissertation's pages, I could say, for the first time without qualification, that I am Dr. Philip Max Sternberg. I'm now as educated as my wife (although the jury's still out as to who's the smarter one). I'm also the second Dr. Sternberg in my family, a tradition that my father would be happy to see me carry on. Of course, in his view, the important aspect of this endeavor is scholarship, not the title; I'm glad he instilled that sense of values in me. For this, and many other reasons, I'm extremely grateful to have the opportunity to dedicate my dissertation to his memory.


Yesterday marked the end of this stage of my life in another way, too; my Jetta, my first car, my graduation present, the car that carried me through all of grad school, was sold to a young couple, both of whom just started their graduate studies. It served me well, and was with me for many fond memories. But it was time to let it go. I think it was best to have it pass out of my hands at the same time as my dissertation; the sense of completion and finality is made all the more real because of it.

A big day, in so many ways.

Tuesday, November 21, 2006

In my office for the last time

(photo credit: Isaiah Lankham)

Friday, November 17, 2006

Return fire!

M. Diaby claims to have invalidated R. Hofman's refutation of his purported proof of P=NP that was discussed earlier. It's not so much an article as an invitation to a public shouting match, seeing as it lacks an abstract, or references, or even self-contained expository prose. I hope Hofman responds again; this could get interesting.

Thursday, November 16, 2006

What's a four-letter word starting with "R" for "Computer technology developed by D. Patterson"?

It all depends on your interests: if data storage is your thing, the answer is RAID; if you like compilers and microprocessor architecture, you're probably looking for RISC. If, on the other hand, you're taken by what the future might hold for parallel computing, the correct acronym is RAMP: Research Accelerator for Multiple Processors. If you want to continue to be impressed, look over the short version of D. Patterson's biography; he also has a slightly longer version, illustrating that impressiveness can scale linearly.

I had the great pleasure to attend Patterson's talk at PARC a week ago on the view from Berkeley. I strongly encourage you to watch the video linked from the above page or look over the accompanying slides; in them the audience is treated to the old and new conventional wisdoms on processor design, an explanation of the sudden π-turn the industry has made regarding parallelism, and visions of a future with thousands of processors (or cores, as some call them) on a single chip; this scenario (termed "many-core" by Patterson) is a far cry from the so-called "multi-core" chips emerging on the market right now.

This is good news for the computational science community; for one thing, it means that the "brick wall" of power consumption, memory latency, and diminishing instruction-level parallelism won't stop scientific applications from continuing their exponential march toward ever-greater computational power. For another, it means there's plenty of work to be done in adapting the fundamental algorithms behind the computational work that goes on these days.
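To put a rough number on why that algorithmic adaptation matters, here is a minimal sketch of Amdahl's law (my own back-of-the-envelope illustration, not taken from Patterson's talk): any serial residue in a workload caps the benefit of piling on cores, so a program that is 95% parallelizable tops out near a 20x speedup even on a thousand cores.

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Amdahl's law: overall speedup on n cores when a fraction p of the
    runtime parallelizes perfectly and the rest (1 - p) stays serial."""
    return 1.0 / ((1.0 - p) + p / n)

# With p = 0.95, going from 16 to 1024 cores buys surprisingly little:
# the serial 5% comes to dominate, and the speedup saturates near 1/(1-p) = 20.
```

This is why "many-core" chips reward rethinking the algorithms themselves rather than simply recompiling yesterday's code.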

And what does it mean for the general, non-scientific computer-using population? At first, it may seem that computers are fast enough, hard drives large enough, etc., for most of what anyone could ever need. However, all it takes is some ingenuity to put all that power to good use; another great example of this sort of imagination was already discussed. If you've ever edited photos, audio, or video, or waited an hour to extract the music from a CD, you've wanted the power that many-core processors may be able to deliver. When asked "How fast is fast enough?", L. Ellison said "In the blink of an eye." We clearly have a long way to go before the current generation of applications meets that standard; hopefully, many-core technology will carry us a long way along that road.

Wednesday, November 08, 2006

Get your hands on some games

One of the most common expository metaphors of discrete mathematics is "playing a game". Sometimes this is taken quite literally, as with the algebro-geometric algorithm jeu de taquin, the enumerative object named for an 8-bit Nintendo game, and the patience sorting algorithm, which American audiences might prefer to call the Klondike sorting algorithm. It shouldn't be surprising that some discrete games have been a rich source of interesting mathematical questions, and that a certain game with a long history still poses academic challenges.
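Since patience sorting came up, here is a short sketch of its greedy pile-building step (the standard textbook version, not tied to any particular treatment): deal each card onto the leftmost pile whose top card is at least as large, starting a new pile when none qualifies. The number of piles at the end equals the length of a longest increasing subsequence of the deal.

```python
import bisect

def patience_piles(seq):
    """Greedy patience sorting. Returns the number of piles, which equals
    the length of a longest (strictly) increasing subsequence of seq."""
    tops = []  # top card of each pile, left to right; always sorted
    for x in seq:
        i = bisect.bisect_left(tops, x)  # leftmost pile with top >= x
        if i == len(tops):
            tops.append(x)   # no such pile: start a new one
        else:
            tops[i] = x      # place x on that pile, lowering its top
    return len(tops)
```

For example, `patience_piles([3, 1, 4, 1, 5, 9, 2, 6])` returns 4, matching the increasing subsequence 3, 4, 5, 9.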

And speaking of games, an excellent source of casual games is J. Bibby's site Jay is Games. I discussed one of their recommended games earlier, although I came across it independently of JiG.

In Planarity, I have often planarized a subset of the vertices and want to move them all at once. If only the interface would let me manipulate more than one vertex at a time! J. Han has put together a device implementing a multi-touch interface in his lab at the Courant Institute; in addition to running silly graph-theoretic games, it augurs what may very well be the next paradigm in human-computer interaction. In case the video on that page isn't enough to make you covet the tenth-generation iMac, take a look at this live demo. He mentions while illustrating the "puppet" application that cutting-edge mathematics and computational science make those dancing drawings possible. We are finally reaching the stage where the massive computational power sitting on the desktop can be harnessed to make computers behave in a way that is convenient to us, rather than the other way around. This is true innovation.

Monday, November 06, 2006

Bucking the trend

As the title of my dissertation indicates, I've done research on crystals. Most non-mathematicians think that crystals refer to three-dimensional polytopes with evident symmetries; I've had to tell many a suddenly eager listener that what they call polyhedra are not the topic of my work. There is a path between crystals and polytopes, but it makes a lengthy journey through Lie theory and algebraic geometry (sometimes with a tropical substitution), which makes even my head spin.

One reason I suspect that so many educated people feel a certain comfort with 3-polytopes is their diagrammatic use in chemistry. I know I have a fondness for my days spent at a lab bench playing with wooden balls and springs, building crude models for carbon rings and water molecules; I'm sure I'm not the only one with this sentiment.

Geodesic domes are a famous family of polytopes, closely related to buckyballs, named for the man responsible for bringing these structures into the public eye. Not only are they good for housing humans, they can also deliver microscopic payloads when their vertices are atoms and their edges are bonds between them; their facets will be pentagons and hexagons, thanks to the chemical nature of carbon. Generally the pentagons in these nano-scale buckyballs (also called fullerenes) must not be adjacent, as is the case for your garden-variety soccer ball. However, a team of chemists (including some at my alma mater) has found an ovoid counterexample.
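The pentagon-and-hexagon claim is actually forced by Euler's formula: in any carbon cage where every atom has three bonds and every face is a pentagon or a hexagon, exactly twelve faces must be pentagons, no matter how large the molecule. A quick sanity check (my own arithmetic, independent of the article):

```python
def fullerene_faces(n_atoms):
    """Face counts for a trivalent cage on n_atoms vertices whose faces are
    all pentagons or hexagons. Euler's formula V - E + F = 2, together with
    5p + 6h = 2E and p + h = F, forces p = 6F - 2E = 12 pentagons."""
    v = n_atoms
    e = 3 * v // 2       # each atom has 3 bonds; each bond joins 2 atoms
    f = 2 - v + e        # Euler's formula for the sphere
    p = 6 * f - 2 * e    # pentagon count (always 12)
    h = f - p            # hexagon count grows with the molecule
    return p, h
```

For the soccer-ball molecule C60 this gives 12 pentagons and 20 hexagons; for C70 it gives 12 pentagons and 25. Only the hexagons scale.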

It seems the motivating application is to get heavy molecules such as triterbium nitride to slip into the human body undetected by encasing them in a carbon cage; we therefore have discreet metals thanks to discrete geometry!

Thursday, November 02, 2006

num·ber: a musical selection

A. Gelfand (any relation to I. Gelfand or S. Gelfand?) has written a pair of stories for Seed and Wired concerning R. Mahanthappa and his new cryptologically inspired album, Codebook. I highly recommend that you give it a listen; between Gelfand's articles, the label's site for the recording, and the artist's myspace page you can listen to most of the numbers on the CD.

I really appreciate his approach to mathematical composition. He describes his initial cipher of J. Coltrane's "Giant Steps" as "unplayable", specifically wishing to avoid "the appearance of random noise." Rather than leave the melody of "Frontburner" in its raw encrypted state, Mahanthappa "had to tweak his coded message until it could be classified as music."
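For readers curious what a melodic cipher might look like in its rawest form, here is a toy sketch. The mapping below is entirely my own invention for illustration; it bears no relation to Mahanthappa's actual encoding, and it shows why an unedited cipher tends toward "the appearance of random noise": the letters of a text land on the twelve chromatic pitch classes with no regard for harmony or contour.

```python
# The 12 pitch classes of the chromatic scale, indexed 0-11 from C.
PITCHES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def encipher(message: str) -> list:
    """Toy cipher: map each letter a-z to a pitch class by its alphabet
    index mod 12, ignoring everything that isn't a letter."""
    return [PITCHES[(ord(c) - ord("a")) % 12]
            for c in message.lower() if c.isalpha()]
```

Running `encipher("code")` yields the fragment D, D, D#, E; turning such raw material into something "classified as music" is exactly the tweaking the articles describe.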

Perhaps it's an overstatement to assert that it wasn't music before the "tweaks", but I'm sure that it's much better music on their account. It's refreshing to see serial elements used as a stimulus for creative expression rather than a programmatic straitjacket.

Wednesday, November 01, 2006

Symptoms of withdrawal

When I first saw mention of a proof of the existence of a solution to the Navier-Stokes equations, I assumed that it had more in common with "proofs" of P=NP or the Riemann hypothesis. In fact, P. Smith had some keen original insights, even if the paper ultimately needed to be withdrawn. There is no shame in this, of course. As G. Kuperberg has illustrated with many examples posted next to his office door, quite capable mathematicians will from time to time push something out the door before it can stand on its own two feet; many of these errors are discovered when a colleague provides a counter-example to one of the central proofs of the article, which I would find considerably more embarrassing than discovering a "serious flaw" in antecedent peer-reviewed literature. Real shame lies in leaving papers up after their erroneousness has been plainly illustrated; as of this writing, this is true of those who wrote about the LP formulation of the TSP just a few weeks ago.

This became a sufficiently hot topic in the blogosphere community that Seed magazine ran a short write-up of the brief history of Smith's work and the comments made from as high up as P. Woit's site. Unfortunately, S. Ornes took the opportunity to sensationalize the matter, suggesting that these events "might ... give mathematicians of the future a strong incentive to be hyper-meticulous about their work", and "[p]erhaps [they] will have to reconsider posting their work on arXiv."

As if there isn't already sufficient incentive. The referee process is ponderous, and it is simply not in anyone's interest, least of all the researcher's, to submit rough work to peer-reviewed journals. Sloppy work will take even longer to make it to press, or possibly be rejected outright. In our "publish or perish" world, six extra months between publications can have a severe impact on a performance review; while on the job market, it can make the difference between being employed and not.

And no one I know will change their practice of submitting to (or defying) the arXiv. The Seed article suggests that many members of the mathematical community will fear the public scrutiny and response that developed in the wake of the Navier-Stokes paper and therefore reduce their use of the electronic pre-print service. The fact is that extremely few of us are doing work of such high profile that it will draw attention beyond the regular readership of our subclassifications; the rest of us have nothing of the sort to fear. Events such as this, as much as they might make for exciting news stories, have a minimal impact on the day-to-day life of working mathematicians.

The mechanisms that I see at work are two-fold: first, the increasing speed of mathematical communication via the internet, and second, the volume of positive attention mathematics has garnered in recent years in the culture at large. We just happen to be looking at a point in history where these two phenomena have interacted more dramatically than they have before, causing an unprecedented event, the speculated impact of which has been tremendously overstated.

Fifty years ago, preprints were mailed to colleagues at remote institutions, journals played a critical role in the communication of new research, and conferences were one of the rare opportunities for intimate interaction between mathematicians from different regions. Now, journals are published both in print and online, almost every author either uses the arXiv or maintains a collection of their preprints on their own website, and email is routinely used to disseminate results or facilitate collaborations. Conferences, while still providing a unique opportunity to travel to exotic locales and spend some uninterrupted quality time with like-minded researchers, no longer play their critical role in discourse, as conference proceedings are made available electronically and air travel costs for one-on-one collaboration are at stupendous historical lows. The trend has always been to take advantage of new technologies whenever they might be useful; the arXiv will maintain its utility, and will therefore see no decline in its usage.

Thanks to movies, television, and the popular press, a larger segment of the population has taken an interest in mathematics than ever before. This means not only that laypeople are more likely to have a demand for news from the math world, but also that more specialists are willing to fulfill that demand (such as in this forum, solipsistic as it may be). The result is one that has been seen time and time again; once a critical mass of anonymous commentators has accrued, the signal will be overwhelmed by noise. It's very sad that in this case some of that noise took the form of personal criticism of Prof. Smith by non-mathematicians who have no experience with the process of mathematics.

We all make mistakes; those of us who live on the cutting edge make a lot of them. We catch most of them before sharing our work with the world, but a few slip through, are caught later, and rectified. It's far better to live by the pursuit of truth than the fear of error.

As a fortune I got after a Chinese meal once told me: