
On being natural


Robert Dow

Perceptual computing and the natural user interface

Intel didn’t invent natural user interfaces, but they sure as hell are going to make them come to life—so to speak. And to help make that happen, the company is putting its money where its, well, where its everything is. Intel has created a $100 million fund for perceptual computing development. As you probably (should) know, perceptual computing covers the next generation of user interfaces: gesture, image recognition, touch, voice, and even emotion sensing. These are new areas, involving new sensors, on a variety of platforms. As a result, there aren’t many developers out there yet. Intel wants to change that, and is offering grants, investments, and other helpful items like hardware and engineering support to attract developers who will come up with clever ideas and get rich. Be inventive, take Intel’s money, make even more.

The scenarios are limitless, and that may be part of the problem: it’s a bit overwhelming

Imagine being able to engage with computers, in your office, car, home, at stores, jogging, flying, everywhere, and get what you want, when you want it, almost instantly. That’s the promise of natural interfaces.
Envision Star Trek—“Computer,” you say in a declarative voice, so it knows you’re talking to it and not about it, “Coffee, no wait, caffe latte, one sugar. Oh, hell, just make it the way I like it.” And in the time it takes to steam the milk, your caffe latte is there.

Remember the scene in Minority Report when Tom Cruise walks into a department store after having stolen someone’s eyes, and the store says, “Hello, Mr. Yakamoto, do you want to look at plaid shirts?” Philip K. Dick saw that vision of the natural user interface in 1956—almost 60 years ago—and we’re just getting to the point where we can do it.

Today we have so much compute power, in such small packages, we almost don’t know what to do with it all. In fact, that’s the problem—we don’t know what to do, or rather how to do it.

The affable, and always amusing, Mooly Eden, president of Intel Israel, is heading up the Perceptual Computing initiative at Intel, and if you have an idea for how to make a computer a really helpful companion, he wants to hear from you—like yesterday. Eden already has approximately 100 people inside Intel working on perceptual computing technologies.

Beyond touch

However, I don’t think natural interfaces will be, or should be, what’s being presented right now: touch, voice, and image. I think they will be, and should be, just voice and image. After all, when I go to the store, I don’t touch the sales clerk, do I? I think the challenge is in the image processing: seeing me, seeing how I feel, how I look, and anticipating my mood, my problem. And I believe that, like voice recognition, that information will be pulled from a database, a huge database, of collected and tagged images of millions of people. Dragon has been doing this for years. They offer a free voice recognition app for your phone. Why? So they can collect all of our speech patterns.
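To make that idea concrete, here is a minimal sketch of the simplest version of such a lookup: an incoming camera frame is reduced to a feature vector and matched by nearest-neighbor search against a database of tagged images. Everything here is invented for illustration; the embed() function is a stand-in for a trained vision model, and a real system would use far larger data and smarter indexing.

```python
import numpy as np

def embed(image: np.ndarray) -> np.ndarray:
    """Stand-in for a vision model that maps an image to a feature vector.
    Here we just flatten the pixels and keep the first 128 values."""
    return image.astype(np.float32).ravel()[:128]

# A "huge database" in miniature: feature vectors with human-applied tags.
# In reality these would be embeddings of millions of collected images.
database = {
    "customer_1234|mood:cheerful": np.random.rand(128),
    "customer_1234|mood:tired":    np.random.rand(128),
    "customer_5678|mood:cheerful": np.random.rand(128),
}

def recognize(image: np.ndarray) -> str:
    """Return the tag whose stored embedding is closest to the query frame."""
    query = embed(image)
    return min(database, key=lambda tag: np.linalg.norm(database[tag] - query))

frame = np.random.rand(64, 64)   # a simulated camera frame
print(recognize(frame))          # e.g. "customer_1234|mood:tired"
```

The point of the sketch is the shape of the system, not the math: recognition becomes a pull from a big tagged dataset, which is exactly why companies want to collect our faces and voices in the first place.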

INTEL’S MOOLY EDEN might look good in plaid.

But that will take time, and so in the meantime we’ll have to use the intermediate interface of touch, and maybe gesture—it’s the middle ground between keyboard and mouse and real visual and voice perception. But we won’t necessarily have to look at or speak to a thing; real perceptual computing will be with us all the time, in our clothes, shoes, eyeglasses, earrings, and belts. We will be the Internet of Things, and the sensors and processors and radios woven (literally) into our lives and clothes will communicate with the rest of the internetosphere.

We won’t have to steal someone’s eyes to have a store recognize us when we walk in; the store will have been notified while our autonomous car is dropping us off out front. And the class of the store will be measured by how we interact with it: does it have people or just robots? High-end stores will have real people; Walmarts will have robots, not the Robby the Robot or I, Robot kind, just machines that dispense what we want, tell us to have a nice day, and automatically remove money from our account to pay for the plaid shirt our car told the store we would be looking for. And our car knew that because the home computer (we don’t even know where it is anymore) heard me tell my wife I was going to go look at shirts, and my wardrobe computer told the car that my favorite plaid shirt had a stain on it and I was depressed about it. That’s my perception of perceptual computing, that and seeing Mooly in a plaid shirt.
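For what it’s worth, that chain of notifications maps onto plain old publish/subscribe messaging. The sketch below is purely illustrative; the topic names, payloads, and in-memory broker are all made up, but it shows the shape of the event flow the scenario implies.

```python
# Purely illustrative: the wardrobe -> car -> store notification chain
# as publish/subscribe events on a toy in-memory broker.
from collections import defaultdict

subscribers = defaultdict(list)

def subscribe(topic, handler):
    """Register a handler to be called for every message on a topic."""
    subscribers[topic].append(handler)

def publish(topic, message):
    """Deliver a message to every handler subscribed to the topic."""
    for handler in subscribers[topic]:
        handler(message)

# The car listens for wardrobe events and forwards a shopping hint.
subscribe("wardrobe/status",
          lambda msg: publish("store/arrival-hint",
                              {"customer": msg["owner"],
                               "interest": f"replacement {msg['item']}"}))

# The store greets the arriving customer using the forwarded hint.
subscribe("store/arrival-hint",
          lambda msg: print(f"Hello {msg['customer']}, "
                            f"looking for a {msg['interest']}?"))

# The wardrobe computer kicks off the chain.
publish("wardrobe/status",
        {"owner": "Mr. Yakamoto", "item": "plaid shirt", "state": "stained"})
```

Running it prints the store’s greeting, which is the whole scenario in three events: the wardrobe notices the stain, the car relays the hint, and the store knows what you came for before you reach the door.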

You can read a free online copy of Dick’s story at http://cwanderson.org/wp-content/uploads