
The data’s there, why not use it?

Posted: 08.17.10
Source: H+ Magazine

Every day we (or at least I) read about an augmented reality (AR) application or installation somewhere in the world. Maybe I’m just sensitive to the topic since it fulfills one of my fantasies about a sci-fi singularity world I can’t wait to enjoy.

The idea of location-based information flowing to me based on a preference-learning algorithm (like the Pandora digital radio application), powered by gigantic cloud-based processors and delivered to my very smart phone (which knows who I am, and where I am), is so appealing I simply can’t wait.
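That Pandora-style filtering is less magic than it sounds. Here’s a minimal sketch of the idea — the places, coordinates, and preference weights are all made up for illustration, not any real service’s data: score each nearby point of interest by a learned preference, discounted by how far away it is.

```python
import math

# Hypothetical preference weights "learned" from my past choices --
# a stand-in for the Pandora-style learning imagined above.
PREFS = {"coffee": 0.9, "bookshop": 0.7, "fast_food": 0.1}

def score(poi, my_lat, my_lon):
    """Score a place: learned preference, discounted by distance."""
    # Rough planar distance in km (fine over a few city blocks)
    dx = (poi["lon"] - my_lon) * 111.32 * math.cos(math.radians(my_lat))
    dy = (poi["lat"] - my_lat) * 110.57
    dist_km = math.hypot(dx, dy)
    return PREFS.get(poi["kind"], 0.0) / (1.0 + dist_km)

# Invented places near a central-London position
places = [
    {"name": "Bean There", "kind": "coffee",    "lat": 51.5076, "lon": -0.1280},
    {"name": "Page One",   "kind": "bookshop",  "lat": 51.5080, "lon": -0.1290},
    {"name": "BurgerBox",  "kind": "fast_food", "lat": 51.5074, "lon": -0.1278},
]

me = (51.5074, -0.1278)
ranked = sorted(places, key=lambda p: score(p, *me), reverse=True)
print([p["name"] for p in ranked])  # the coffee shop wins despite BurgerBox being closest
```

The point of the division by distance is that a strong preference beats mere proximity — which is exactly the behavior you’d want from an algorithm that knows who you are as well as where you are.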

Why should I wait? Do I really have to? In our new Audi we can tell the nav system to show gas stations, ATMs, clothing stores, parking lots, and more. The data is there in the cloud. We all helped put it there, either by voting with our presence and Euros/Yen/Dollars or by unconscious selection with click-throughs. You can even get an app that will locate your phone—that’s the ultimate irony: find the finder. (And yes, there have been stalking cases where that app has been misused.)

You can find shops for food, banks, pubs, and bicycle shops in the London underground, or pretzel stands and bookshops in the New York subway, or other trains in Tokyo’s underground. You can buy that data, and a lot of it is free—those merchants and underground train operators want you to know where they are. So the data is there, and a few enterprising folks in various countries are wedding it to the camera in your pretty-smart phone.

You’ve seen the demo—you turn on the camera, point it at something—a street, the ground, a building—and overlays about the stuff in the area float in the image.

(If you haven’t seen them, go here: http://mashable.com/2009/08/19/augmented-reality-apps/—and look at Layar and others.)

With an app like Nearyou you can point at things to get data about them, or find places, and then you can lay your phone flat in your hand (parallel to the ground) and see a 2D map with directions to whatever you’re looking for. So Zagat will find restaurants for you and allow you to make reservations—and you can do it with voice recognition—how’s that for the ultimate smart companion?
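The point-and-see trick boils down to compass math: the phone knows its own heading, so an overlay appears whenever the bearing from you to a point of interest falls inside the camera’s horizontal field of view. A toy sketch — the coordinates, the pretzel stand, and the 60° field of view are invented for illustration:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def in_view(phone_heading, target_bearing, fov=60):
    """Is the target inside the camera's horizontal field of view?"""
    diff = (target_bearing - phone_heading + 180) % 360 - 180
    return abs(diff) <= fov / 2

# Standing in Times Square, phone pointed due north (heading 0)
me = (40.7580, -73.9855)
pretzel_stand = (40.7595, -73.9845)   # hypothetical POI a block away
b = bearing_deg(*me, *pretzel_stand)
print(round(b), in_view(0, b))
```

Tilt the phone flat and the same bearing-and-distance numbers drive the 2D map view instead — same data, different projection.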

And you can play games like AR Labyrinth, which creates a 3D maze on your phone that you wander through. (You have to be in a large open area like a field—wouldn’t want you walking into people, walls, or buses.)

So we’re getting really close. Right now the AR apps are all visual. Yet we walk around most of the time with earbuds in, especially in crowded transportation systems like airports and undergrounds. What about audio AR? In addition to the alerting aspects of merchandising, think of the aid it would provide for people with sight impairments, or with age-related disorders (including forgetting where the car is parked). How about a simple alert (because, after all, we wouldn’t want Lady Gaga interrupted) that said, “Better check the map—NOW! Prada alert.”
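An audio AR alert like that is really just a geofence check: measure the distance to each tagged spot and speak up when you wander inside the radius. A rough sketch — the store coordinates, the 75-meter radius, and the `audio_alerts` helper are all invented for illustration:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    R = 6371000  # mean Earth radius
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def audio_alerts(my_pos, geofences, radius_m=75):
    """Return spoken alerts for any geofence the listener has entered."""
    return [f"Better check the map! {name} alert."
            for name, pos in geofences.items()
            if haversine_m(*my_pos, *pos) <= radius_m]

# Made-up tagged spots: a store, and where I left the car
fences = {
    "Prada": (40.7246, -73.9986),
    "Parked car": (40.7300, -74.0000),
}
print(audio_alerts((40.7247, -73.9985), fences))
```

Feed the returned strings to text-to-speech, duck the music for a second, and Lady Gaga barely notices.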

And if the getting-smarter phone can be tied into the semi-smart car’s audio system for phone calls, why couldn’t it cooperate with the car’s GPS and database, and use its visual system too?

Point and see what’s available. (Source: TwitARound)

And how about AR-virtual video conferences? I’m in my pajamas, I need a shave, and my hair hasn’t felt a comb or brush in 24 hours—and you want to have a video conference? I don’t think so. But just a minute, let me send my virtual avatar to you. VenueGen has a system that tries to do this. It’s too bandwidth-dependent right now, but it shows promise.

I’m super excited about the LTE bandwidth coming. Combined with the gazillions of MIPS available in the cloud, our linkage to virtual worlds presented in AR on our super-smart phones is clearly how we get to the singularity. Can we make this train go faster, please?