Jeff Han works on multi-touch interfaces: touch screens that can recognize more than one point of input, and thus combinations of gestures and so on. Here’s a cool video showing some of the interface methods his company is developing. (Warning: cheesy music.)

You can see some cool possibilities for educational bells and whistles, such as the taxonomic tree one of the operators is seen navigating. The possibilities for high-dimensional dynamic data visualization are also obvious. We see a scatterplot with scaled data points being manipulated, for instance. Something like GGobi would be fun to use on a system like this. In the near future, the phrase “touchy-feely” may well apply to the quantitative rather than the qualitative crowd.