
Friday, July 27, 2007

In case of power outage

In the EECS building on the Berkeley campus...

Monday, May 21, 2007

Delinquency

For those distraught by the dearth of mathematically themed writings in this space, may I suggest my shared items in Google Reader, easily accessed either by the first of the above links, or in the sidebar of this page, entitled "And what have I been reading?"

My distractions from these posts have not been all work. I've written before about the great strides being made in human-computer interaction, especially regarding tools for musical creation. Last Saturday night I had the extreme pleasure of seeing Björk (or B. Guðmundsdóttir, for the sake of nomenclatural purity) perform at the Shoreline amphitheater, with M. Bell at the helm of a reactable. This instrument, first pointed out to me by J. Hopper (who is inexplicably nigh invisible to Google), is similar to the audiopad from MIT's media lab, but with a crucial difference: it is out of the prototype stage, and in front of a mainstream audience. Hopefully commercialization is not too far off.

Tuesday, March 20, 2007

Build character!

A few research institutions and news outlets, as well as some blogs, have reported on what is generally being termed the "mapping" of E8. This is about as good a layman's description of the result as can be given, although for those who know a smattering of group theory, it would be better to say that the fruit of these labors is a table relating the conjugacy classes of E8 to each other. Seeing as there are 453,060 such classes, this requires "writing down" 205,263,363,600 numbers; written out on paper, they would cover Manhattan, and they take 60 times more storage than the human genome, as the researchers and the reporters to whom they've spoken are fond of pointing out.
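If you like back-of-the-envelope checks, notice that the big number is exactly the square of the smaller one: a table with one entry for every ordered pair of classes accounts for it precisely. A two-line sanity check (my arithmetic, not the atlas project's accounting):

```python
# The quoted figures fit a square table: one entry per ordered pair
# of the 453,060 conjugacy classes.
classes = 453060
entries = classes ** 2
print(entries)                    # 205263363600
print(entries == 205263363600)    # True
```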

I can't help but notice the striking resemblance between some of the pictures from the atlas project and the output of one of my projects. Granted, you can't use it to generate E8 crystals, but then again, it's just running on a simple webserver.

And by the way, if you or anyone you know would like to take over the development of this software, please contact me. This was the start of something really good, but other concerns have taken center stage since then.

Tuesday, February 27, 2007

Would it help if I drew a flow chart?

Such great things come out of MIT's media lab: one of their visitors has built a water-based computer.

This could come in handy in the event of a massive electromagnetic pulse. However, I suspect most people would have other matters on their mind at that point.

In all seriousness, this is a wonderful project. Like other mechanical computing devices, it allows the fundamental procedures of computation to be animated in space, relieving the executor of the burden of mental modeling.

But remember, if you're seated in the first six rows, you may get wet during this subroutine.

Wednesday, January 31, 2007

Shaken to the (multi) core

Catherine Crawford, chief architect for next-generation systems software at IBM Systems Group's Quasar Design Center, has made a bold statement about the future of software; as is to be expected, some sources make it sound a little more sensationally apocalyptic.

This isn't the first time I've brought up the many-core future of the desktop. You may be wondering who will have the expertise to write the software that will take advantage of this massively parallel paradigm. Well, that will be just one skill that I'll be developing while a postdoctoral fellow at a very fine institution.

Wednesday, January 24, 2007

Let the CHI flow

For the purposes of this post, that's Computer-Human Interaction, not the Chinese concept of life force.

I made my way to PARC again last Thursday to hear a great talk about how the landscape of electronic entertainment is changing thanks to developments in HCI. It really says something about my interests that so many of the topics that T. Blaine covered were already familiar to me. Off the top of my head, these include: Guitar Hero, Karaoke Revolution, the DS, the Wii, D'CüCKOO, audiopad, and Jeff Han's multi-touch interface. What ties her interests together, and ties them to mine, is the way custom hardware can facilitate musical creation through intuitive human manipulation.

Of all the above, only Karaoke Revolution uses the single most intuitive human tool for sound, the voice. One of the more interesting projects underway in the realm of speech-and-song control of computers was covered in this space earlier; unfortunately, this wonderful creative tool is still not publicly available.

However, an even greater voice-controlled application has just been released. Have you ever heard a song on the radio, but not caught the attribution, only to have that catchy hook running through your head, leaving you wishing that you knew who wrote the song? Thanks to midomi, you need wonder no more! Just sing into your browser and find all the covers of "Fly Me to the Moon", or who's done that "Doo-wah-ditty-ditty-dum-ditty-doo" song. They're still in "invitation only" beta, but not to worry; if you want me to get you past the velvet rope, just let me know in the comments.

Friday, January 19, 2007

Synchronicity

The other day, I took the time to read S. Yegge's latest essay on software, complex systems, and consciousness; the thesis of the work, an idea that Steve has been refining for some time now, is that "the most important principle in all of software design is this: Systems should never reboot." He gives numerous examples of software products that fail to incorporate this principle, and a few that provide a weak, half-hearted attempt at it. He then goes on to explain that given his (quite reasonable) definition of software, the best systems have this idea built into their core:
So my first argument against rebooting is that in nature it doesn't happen. Or, more accurately, when it does happen it's pretty catastrophic. If you don't like the way a person works, you don't kill them, fix their DNA, and then regrow them. If you don't like the way a government works, you don't shut it down, figure out what's wrong, and start it back up again. Why, then, do we almost always develop software that way?


To celebrate turning 30, R. Stevens created a new t-shirt design (which is now available). Imagine my surprise to find this in his official announcement:
We all have pretty much the same personalities we were born with, just earlier versions. Our software never really gets rewritten, it just evolves.


Rich, Steve, allow me to introduce you to each other.

Thursday, January 11, 2007

Touch you I

If you're reading this, you must have heard Tuesday's big news. Even if you weren't following it on twitter or the web, you found out soon enough, either from a major news outlet or by reading any of the thousands of blogs discussing it.

That number is no exaggeration: as of this writing, technorati lists 83,622 posts, and Google Blog Search returns about 86,006 results from the past three days.

If you've been following this space long enough, you've already seen some aspects of the interface to this device. The "pinch" zoom paradigm looks just like the light box tool that J. Han's lab put together, so you can proudly tell all your friends that you knew about this whole multi-touch UI way before Apple brought it into the mainstream, which suddenly made it so much less cool.
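And for anyone wondering what a pinch gesture actually computes: essentially one ratio. Here's a toy sketch of the idea (my own illustration, not Apple's or Han's code), assuming two tracked touch points:

```python
import math

def pinch_scale(start_a, start_b, now_a, now_b):
    """Zoom factor from two touch points: current finger spread over initial spread."""
    d0 = math.dist(start_a, start_b)   # distance when the gesture began
    d1 = math.dist(now_a, now_b)       # distance right now
    return d1 / d0 if d0 else 1.0

# Fingers start 100 px apart and spread to 140 px: zoom in by 40%.
print(pinch_scale((100, 100), (200, 100), (80, 100), (220, 100)))  # 1.4
```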

But that gloat leaves you open to one-upmanship, as I just found out that J. Han was not the first to implement such gestural control of a computer. Over at Microsoft Research, A. Wilson had a working demo of TouchLight in late 2004. If anyone knows of an earlier claim to this concept, please let me know; otherwise, cheers to A. Wilson's ingenuity and creative spark!

While proper credit is due the originator of any idea, it is equally important to note the astounding progress each of these iterations accomplished. The product from the Courant Institute took some great HCI ideas and built a robust interface library and scalable hardware around them, inviting software developers to join in the fun. Now, Apple Computer Inc. has managed to pack that technology into less space than a satisfying meal, along with wireless transmitters, a camera, an accelerometer, etc., and to make it all as attractive as we expect Apple products to be.

It is crucial to remember that the well executed embodiment of an idea is itself a creative task, often no less challenging than the development of the original idea.

Friday, December 29, 2006

Howdy, neighbor

Constructing phylogenetic trees from distances is crucial to computational genomics. That's why this paper is such a comfort; R. Mihaescu, D. Levy, and L. Pachter have shown that the neighbor-joining algorithm accomplishes this task very quickly, at least in an asymptotic and probabilistic sense.
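For those who haven't met the algorithm, one neighbor-joining step looks like this: from the current distance matrix, join the pair of taxa minimizing the Q-criterion, collapse them, and repeat. Below is a textbook sketch of just that selection step, with a made-up distance matrix, to make the recipe concrete; the paper's contribution concerns the behavior of the full algorithm, not this step.

```python
# One neighbor-joining step: pick the pair to join via the Q-criterion.
def nj_pick_pair(d):
    """Return the pair (i, j) minimizing Q(i, j) = (n-2) d[i][j] - sum_k d[i][k] - sum_k d[j][k],
    where d is a symmetric list-of-lists distance matrix over taxa 0..n-1."""
    n = len(d)
    totals = [sum(row) for row in d]
    best, best_q = None, float("inf")
    for i in range(n):
        for j in range(i + 1, n):
            q = (n - 2) * d[i][j] - totals[i] - totals[j]
            if q < best_q:
                best, best_q = (i, j), q
    return best

# Five taxa whose distances fit an additive tree.
d = [[0, 5, 9, 9, 8],
     [5, 0, 10, 10, 9],
     [9, 10, 0, 8, 7],
     [9, 10, 8, 0, 3],
     [8, 9, 7, 3, 0]]
print(nj_pick_pair(d))  # (0, 1): taxa 0 and 1 are joined first
```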

I once overheard the third author wish to see his website at the top of the list of Liors. Back then he was fourth; as of this writing, he's made his way to the number two spot. I'm willing to do my part to help his dream become reality.

Friday, November 17, 2006

Return fire!

M. Diaby claims to have invalidated R. Hofman's refutation of his purported proof of P=NP that was discussed earlier. It's not so much an article as an invitation to a public shouting match, seeing as it lacks an abstract, or references, or even self-contained expository prose. I hope Hofman responds again; this could get interesting.

Thursday, November 16, 2006

What's a four-letter word starting with "R" for "Computer technology developed by D. Patterson"?

It all depends on your interests: if data storage is your thing, the answer is RAID; if you like compilers and microprocessor architecture, you're probably looking for RISC. If, on the other hand, you're taken by what the future might hold for parallel computing, the correct acronym is RAMP: Research Accelerator for Multiple Processors. If you want to continue to be impressed, look over the short version of D. Patterson's biography; he also has a slightly longer version, illustrating that impressiveness can scale linearly.

I had the great pleasure to attend Patterson's talk at PARC a week ago on the view from Berkeley. I strongly encourage you to watch the video linked from the above page or look over the accompanying slides; in them the audience is treated to the old and new conventional wisdoms on processor design, an explanation of the sudden π-turn the industry has made regarding parallelism, and visions of a future with thousands of processors (or cores, as some call them) on a single chip; this scenario (termed "many-core" by Patterson) is a far cry from the so-called "multi-core" chips emerging on the market right now.

This is good news for the computational science community; for one thing, it means that the "brick wall" of power consumption, memory accessibility, and instruction-level parallelism won't halt the exponential march of scientific applications toward ever-greater computational power. For another, it means there's plenty of work to be done in adapting the fundamental algorithms behind today's computational science to a massively parallel world.

And what does it mean for the general, non-scientific computer-using population? At first, it may seem that computers are fast enough, hard drives large enough, etc., for most of what anyone could ever need. However, all it takes is some ingenuity to put all that power to good use; another great example of this sort of imagination was already discussed. If you've ever edited photos, audio, or video, or waited an hour to extract the music from a CD, you've wanted the power that many-core processors may be able to deliver. When asked "How fast is fast enough?", L. Ellison said "In the blink of an eye." We clearly have a long way to go before the current generation of applications meets that standard; hopefully, many-core technology will carry us a good distance down that road.
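To make that concrete, here is a toy sketch (mine, not anything from Patterson's talk) of why embarrassingly parallel chores like ripping a CD benefit so directly: the work divides cleanly across however many cores you have. The busy-loop is a made-up stand-in for encoding one track.

```python
# Toy comparison of serial vs. parallel "encoding" of a dozen tracks.
import time
from multiprocessing import Pool, cpu_count

def encode(track):
    """Stand-in for a CPU-bound task such as encoding one audio track."""
    total = 0
    for i in range(2000000):
        total += (track * i) % 7
    return total

if __name__ == "__main__":
    tracks = list(range(12))

    start = time.time()
    serial = [encode(t) for t in tracks]
    print("serial:   %.2fs" % (time.time() - start))

    start = time.time()
    with Pool(cpu_count()) as pool:
        parallel = pool.map(encode, tracks)     # one track per worker at a time
    print("parallel: %.2fs on %d cores" % (time.time() - start, cpu_count()))
```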

Wednesday, November 08, 2006

Get your hands on some games

One of the most common expository metaphors of discrete mathematics is "playing a game". Sometimes this is taken quite literally, as with the algebro-geometric algorithm jeu de taquin, the enumerative object named for an 8-bit Nintendo game, and the patience sorting algorithm, which American audiences might prefer to call the Klondike sorting algorithm. It shouldn't be surprising that some discrete games have been a rich source of interesting mathematical questions, and that a certain game with a long history still poses academic challenges.
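Patience sorting, in particular, is short enough to play along at home. Here is a generic sketch (with a made-up deck): deal each card onto the leftmost pile whose top card is at least as large, or start a new pile; a final merge of the piles yields the sorted deck, and the number of piles is the length of a longest increasing subsequence.

```python
import bisect
import heapq

def patience_sort(deck):
    piles = []   # piles[i] holds cards bottom-to-top
    tops = []    # tops[i] == piles[i][-1]; stays sorted increasing
    for card in deck:
        i = bisect.bisect_left(tops, card)   # leftmost pile whose top >= card
        if i == len(piles):
            piles.append([card])
            tops.append(card)
        else:
            piles[i].append(card)
            tops[i] = card
    # Each pile read top-to-bottom is sorted, so a k-way merge finishes the job.
    return list(heapq.merge(*(reversed(p) for p in piles))), len(piles)

deck = [7, 2, 8, 1, 3, 4, 10, 6, 9, 5]
sorted_deck, n_piles = patience_sort(deck)
print(sorted_deck)   # [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
print(n_piles)       # 5: the length of a longest strictly increasing subsequence
```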

And speaking of games, an excellent source of casual games is J. Bibby's site Jay is Games. I discussed one of their recommended games earlier, although I came across it independently of JiG.

In Planarity, I often find that I've planarized a subset of the vertices and want to move them all at once. If only the interface would let me manipulate more than one vertex at a time! J. Han has put together a device implementing a multi-touch interface in his lab at the Courant Institute; in addition to running silly graph-theoretic games, it augurs what may very well be the next paradigm in human-computer interaction. In case the video on that page isn't enough to make you covet the tenth-generation iMac, take a look at this live demo. While illustrating the "puppet" application, he mentions that cutting-edge mathematics and computational science make those dancing drawings possible. We are finally reaching the stage where the massive computational power sitting on the desktop can be harnessed to make computers behave in a way that is convenient to us, rather than the other way around. This is true innovation.

Thursday, October 26, 2006

IP $\neq$ LP

And in case you might have heard otherwise, we still don't know about P and NP. There are some recently submitted papers (all of which are rather light on references) in the Computational Complexity section of the CS arXiv claiming that P and NP are the same thing, based on formulating the traveling salesman problem as a linear program. Of course, they all make the classic mistake of underestimating the importance of constraining the variables to be integers.
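To see the gap in miniature, here is a standard toy example (nothing to do with those TSP papers themselves): cover the edges of a triangle with as few vertices as possible,
$$\min\; x_1+x_2+x_3 \quad\text{subject to}\quad x_1+x_2\ge 1,\; x_1+x_3\ge 1,\; x_2+x_3\ge 1.$$
With $x_i\in\{0,1\}$ the optimum is $2$, since no single vertex touches all three edges. Relax to $0\le x_i\le 1$ and the optimum drops to $3/2$, attained at $x_1=x_2=x_3=\tfrac12$. The linear program is solvable in polynomial time, but its optimum is not a solution of the integer problem, and that is exactly the step such arguments gloss over.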

The traveling salesman problem (or the TSP, an abbreviation that reduces the running time of saying its name by a factor of 2) asks simply "What is the shortest journey that makes exactly one stop in every city on a list?" Lest you think this problem is hopelessly academic, peruse this description of a few of the applications of the TSP. And lest you think that this problem seems simple, consider that $1,000,000 rests on its resolution.

I suppose it's not too shocking that a glory-seeking Associate Professor and an unknown researcher (is he a graduate student? an employee of a call-center software company?) would throw their hats in the ring. Hopefully they will learn that integer programming (IP) is not the same thing as linear programming (LP).

Fortunately, R. Hofman has set the record straight. Now, will those other two fellows follow the directions for withdrawing a paper from the arXiv?

Update 11/3/2006: R. Hofman has composed a more general rebuttal to the authors who have used linear programming to claim that P=NP.

Friday, August 18, 2006

Stacks!

Stacks are one of the most fundamental abstract data structures in computer science, providing most programmers with their first exposure to dynamic memory allocation. And as E. LaForest explains, they can also serve in place of registers in a processor core, potentially with a dramatic performance benefit. I'm curious to see how far this technology can go; will we have stack-based handhelds in ten years? Will CS undergrads bemoan their Forth class as they now do C++?
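To see the data structure doing double duty as an execution model, here is a toy postfix evaluator (a generic sketch, nothing to do with LaForest's hardware) in which the stack is the only storage: operands get pushed, operators pop what they need and push the result.

```python
# A tiny stack machine evaluating Forth-style postfix arithmetic.
def eval_postfix(tokens):
    stack = []
    ops = {"+": lambda a, b: a + b,
           "-": lambda a, b: a - b,
           "*": lambda a, b: a * b,
           "/": lambda a, b: a / b}
    for tok in tokens:
        if tok in ops:
            b = stack.pop()          # top of stack is the second operand
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

print(eval_postfix("3 4 + 2 *".split()))   # (3 + 4) * 2 = 14.0
```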

Friday, July 21, 2006

High Performance

I now have an account on Jacquard for the purpose of parallel programming, which I'll be getting into later this summer. I would imagine it's not unlike sitting in the driver's seat of a Lotus.

Those who are concerned about how I might use all this power can rest easy; I've agreed not to use it to develop weapons of mass destruction, and I am forbidden from using it on behalf of citizens of certain countries.

Monday, April 17, 2006

The new generation of audio ransom note

Wired's music guy, E. Van Buskirk, interviewed S. König about his software project cum musical mash-up collage tool cum political statement on intellectual property, sCrAmBlEd?HaCkZ!. It's not only an interesting idea, but there are also numerous computational challenges to making it work, and work so well. I want to know how he does all that! I suppose I'll find out when it gets sourceforged.
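I can only speculate about what's under the hood, but one ingredient such a tool presumably needs is fast audio-similarity lookup. A crude sketch of that one piece (entirely my guess, not König's code) might fingerprint short frames by their magnitude spectra and pick the nearest stored snippet:

```python
import numpy as np

def fingerprint(frame):
    """Magnitude spectrum of one windowed audio frame, normalized to unit length."""
    spec = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    return spec / (np.linalg.norm(spec) + 1e-12)

def nearest_snippet(query_frame, database):
    """Index of the stored frame whose fingerprint is most similar to the query."""
    q = fingerprint(query_frame)
    scores = [np.dot(q, fingerprint(f)) for f in database]   # cosine similarity
    return int(np.argmax(scores))

# Toy usage with synthetic "audio": three stored frames, one query near the second.
rng = np.random.default_rng(0)
database = [rng.standard_normal(1024) for _ in range(3)]
query = database[1] + 0.05 * rng.standard_normal(1024)
print(nearest_snippet(query, database))   # 1
```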

Freed from economic and social constraints, I would volunteer to work on this without a second thought.

Wednesday, March 08, 2006

Look, biological imaging is actually useful

A group led by David Shapiro of SUNY Stony Brook has developed a new algorithm for image reconstruction from X-ray diffraction microscopy. And as has been reported on Cornell's news site, it is easily adapted to quickly solve sudoku puzzles.

So, will a sizable number of young Americans realize that combinatorialists solve puzzles like sudoku, and decide that they want to pursue discrete math as a major, or even a career?