Taking natural interaction with computers to the next level

Imagine a future in which we do more than touch, swipe, and tap at our interactive workstations, and instead communicate with our devices increasingly through voice and gesture.

In meeting rooms, massive touchscreens will let multiple people interact simultaneously, tracking who we are, where we are, and what we're doing and saying, and adapting accordingly.

This is the new NUI (natural user interface), and the Holy Grail for those who believe that computing will someday be as natural as breathing.

At the Microsoft Envisioning Center, Kinect for Windows and Perceptive Pixel play a huge role in this effort. You might already be familiar with Perceptive Pixel, which makes the large, multi-touch pen-and-touch displays common to election coverage on CNN, for example. (Microsoft acquired Perceptive Pixel in 2012.)

Learn how the Envisioning Center team is using these technologies to take NUI — and depth sensing, facial recognition, voice, and audio capabilities across all devices — to the next level.

You might also be interested in:

· Learning at the edge of chaos
· Let’s HereHere it for this New York City neighborhood data project from Microsoft Research
· The kids are (still) alright

Aimee Riordan
Microsoft News Center Staff