Adaptive interfaces from Microsoft Research

This is one of the demos from Microsoft Research's TechFest that I forgot to post about – it was a busy week. File this one under technology that makes our interactions more natural (NUI, for those keeping up).

I got to play with this one, and it was indeed very natural. Most impressive was how it figured out I wanted to erase something, simply by the way I held the object. The project could do with a snappier name, though – Recognizing Pen Grips for Natural User Interaction isn't too natural to say :)

How does this work? By using sensors – multitouch and orientation sensors, specifically. Hands up: where do you see this being used? Games, art, CAD? Others?
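To make the idea concrete, here's a toy sketch of what grip-based intent detection could look like. This is purely hypothetical and not the research prototype's actual method – the `GripSample` fields, thresholds, and labels are all my own invention, just combining the two signals the demo reportedly uses (multitouch contact data and orientation):

```python
# Hypothetical sketch -- NOT the actual research implementation.
# Illustrates combining multitouch contacts and orientation to guess intent.

from dataclasses import dataclass


@dataclass
class GripSample:
    contact_count: int   # fingers touching the pen barrel (from multitouch)
    tilt_degrees: float  # pen tilt from vertical (from orientation sensor)


def classify_grip(sample: GripSample) -> str:
    """Toy heuristic: a full-hand wrap with the pen held flat suggests
    erasing; a fine tripod grip held upright suggests writing."""
    if sample.contact_count >= 4 and sample.tilt_degrees > 60:
        return "erase"
    if sample.contact_count <= 3 and sample.tilt_degrees < 45:
        return "write"
    return "unknown"


# A fist-like grip, pen nearly flat -> erase
print(classify_grip(GripSample(contact_count=5, tilt_degrees=75)))
# A precise two-finger grip, pen upright -> write
print(classify_grip(GripSample(contact_count=2, tilt_degrees=20)))
```

The real system presumably does something far more robust (the raw touch image of the palm and fingers carries much more information than a contact count), but the principle – map how the object is held onto what the user wants to do – is the same.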