Thursday, 20 February 2014

Google Launches Project Tango Smartphone To Experiment With Computer Vision And 3D Sensors

Source: TechCrunch
Google today announced Project Tango, an Android-based prototype 5″ phone and developer kit with advanced 3D sensors out of its Advanced Technology and Projects (ATAP) hardware skunkworks group.
Using these sensors, the phone doesn’t just track its own motion; it can also build a visual map of rooms through 3D scanning. The company believes the combination of these sensors with advanced computer vision techniques will open up new avenues for indoor navigation and immersive gaming, among many other things.
Starting today, Google will allow developers to sign up for access to these phones, but the first run will be limited to a hand-vetted group of 200 developers. Developers will have to provide Google with a clear idea of what they want to build with the device, and the company expects to allocate all devices by March 14th, 2014. It will allocate the devices to developers who want to build apps for “indoor navigation/mapping, single/multiplayer games that use physical space, and new algorithms for processing sensor data.”
Developers will be able to write apps in Java, C/C++, and with the help of the Unity game engine. The company notes that the APIs for the phone remain a work in progress.
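Since those APIs aren’t public yet, any code is speculative, but the shape of the thing is easy to imagine: apps register a listener and receive a stream of position-plus-orientation updates, much like today’s location listeners. The self-contained Java sketch below illustrates that pattern only; every type in it (Pose, PoseListener) is an invented stand-in, not an actual Tango class.

```java
// Hypothetical sketch: how an Android app might consume 6-DoF pose updates
// from a Tango-style motion-tracking API. All types below are invented
// stand-ins; the real APIs were still a work in progress at the time of
// this announcement.
public class PoseTrackingDemo {

    /** Position (meters) and orientation (unit quaternion) of the device. */
    static final class Pose {
        final double x, y, z;          // translation relative to where tracking started
        final double qx, qy, qz, qw;   // rotation as a quaternion

        Pose(double x, double y, double z,
             double qx, double qy, double qz, double qw) {
            this.x = x; this.y = y; this.z = z;
            this.qx = qx; this.qy = qy; this.qz = qz; this.qw = qw;
        }
    }

    /** Callback an app would register to receive pose updates. */
    interface PoseListener {
        void onPoseAvailable(Pose pose);
    }

    public static void main(String[] args) {
        PoseListener listener = pose -> System.out.printf(
                "device at (%.2f, %.2f, %.2f)%n", pose.x, pose.y, pose.z);

        // In a real app the sensor service would invoke this many times per
        // second; here we fake a single update to exercise the listener.
        listener.onPoseAvailable(new Pose(0.5, 0.0, 1.2, 0, 0, 0, 1));
    }
}
```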
“Project Tango strives to give mobile devices a human-like understanding of space and motion through advanced sensor fusion and computer vision, enabling new and enhanced types of user experiences – including 3D scanning, indoor navigation and immersive gaming,” said Johnny Lee, ATAP’s technical program lead.
The idea behind Project Tango is to see what kind of applications developers will dream up for this technology. Google hopes to unlock new kinds of smart applications built on the 3D sensing and computer vision technology in the phone. By giving apps an almost human-like understanding of space, the platform should let developers create applications that simply weren’t possible before.
The phones are outfitted with a compass and gyroscopes, just like any other smartphone, but in addition they feature Kinect-like visual sensors that can scan the surrounding room.
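The “just like any other phone” half of that is already visible in the standard Android sensor stack. The fragment below uses real, existing Android APIs (SensorManager and the fused ROTATION_VECTOR sensor) to read device orientation; it is only an illustration of the baseline Tango builds on, not Tango code.

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

// Standard Android sensor APIs: the ROTATION_VECTOR sensor fuses
// accelerometer, gyroscope and compass readings into a single device
// orientation -- the part every current phone can already do.
public class OrientationActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        Sensor rotation = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
        sensorManager.registerListener(this, rotation, SensorManager.SENSOR_DELAY_UI);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Convert the fused rotation vector into a 3x3 rotation matrix that
        // describes how the device is oriented relative to the world frame.
        float[] rotationMatrix = new float[9];
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        // Orientation alone says which way the phone points; Tango's depth
        // sensors add where it is and what the room around it looks like.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```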
It’s worth noting that the idea here isn’t to create Leap Motion-like, gesture-based interfaces. It’s about the apps developers can create when they know exactly where a phone is in space.
In its announcement, Google asks: “What if you could capture the dimensions of your home simply by walking around with your phone before you went furniture shopping? What if directions to a new location didn’t stop at the street address? What if you never again found yourself lost in a new building?”
The project was headed up by Lee, who previously worked on Microsoft’s Kinect technology before he left for Google in early 2011. Today’s announcement also marks the first public hardware release from Google’s ATAP group, which was one of the few Motorola units Google decided to keep, even as it sells off the rest of that business.
