Google Street View, eat your heart out: an MIT-built quadrocopter uses Microsoft Kinect, and some smart odometry algorithms, to fly around a room and spit out a 3D map of the environment.
The drone itself is a lightweight UAV with four rotors, an antenna, protective and stabilising shielding, and the guts of a Kinect sensor. It communicates with a nearby laptop, but all the sensing and computation needed to determine its position is done onboard by an internal computer.
The quadrocopter runs an MIT-developed real-time visual odometry algorithm to compute location and velocity estimates, which stabilise the vehicle during fully autonomous 3D flight. Most small drones rely on GPS or pre-coded information about the area to avoid bumping into things. This little vehicle does all those calculations on the fly.
Odometry is the process of using sensor data to work out position, like measuring how far a robot's legs or wheels have turned to determine how far it has travelled. In this case, the algorithm looks at successive colour frames and depth estimates from the Kinect, and matches features across the images to estimate how far, and in which direction, the vehicle has moved.
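To give a flavour of the maths involved, here's a minimal sketch of the core step in that kind of visual odometry: once features have been matched between two frames and back-projected into 3D using the Kinect's depth data, the camera's motion is the rigid transform that best aligns the two point sets. This sketch uses the standard SVD-based Kabsch/Procrustes method; it's an illustration of the general technique, not MIT's actual algorithm, and the function name and inputs are our own.

```python
import numpy as np

def estimate_motion(prev_pts, curr_pts):
    """Estimate the rigid transform (R, t) mapping matched 3D feature
    points in the previous frame onto their positions in the current
    frame. Inputs are (N, 3) arrays of depth-backprojected features;
    uses the SVD-based Kabsch/Procrustes solution."""
    # Centre both point sets on their centroids.
    mu_p = prev_pts.mean(axis=0)
    mu_c = curr_pts.mean(axis=0)
    P = prev_pts - mu_p
    C = curr_pts - mu_c
    # Optimal rotation from the SVD of the 3x3 cross-covariance matrix.
    U, _, Vt = np.linalg.svd(P.T @ C)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    # Translation follows from the centroids.
    t = mu_c - R @ mu_p
    return R, t

# Sanity check: apply a known rotation and translation, then recover it.
rng = np.random.default_rng(0)
pts = rng.standard_normal((50, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.1, -0.2, 0.05])
moved = pts @ R_true.T + t_true
R, t = estimate_motion(pts, moved)
```

In a real pipeline the matches come from a feature detector run on the colour images, with outliers rejected (e.g. via RANSAC) before the alignment step; chaining these frame-to-frame transforms gives the position and velocity estimates used to stabilise the vehicle.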
The data is simultaneously streamed to a nearby computer, which combines the colour photos and location data to build a neat 3D model of the space. The MIT team collaborated with Peter Henry and Mike Krainin from the robotics and state estimation lab at the University of Washington to use their RGBD-SLAM algorithms.
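The first step in turning Kinect frames into a model like that is back-projecting each depth pixel into a 3D point using the pinhole camera model; the resulting per-frame point clouds, coloured by the camera image, are then registered into one map using the estimated poses. A minimal sketch of the back-projection step (the function name and intrinsic parameters here are illustrative, not the Kinect's actual calibration):

```python
import numpy as np

def depth_to_cloud(depth, fx, fy, cx, cy):
    """Back-project an (H, W) metric depth image into an (H*W, 3) point
    cloud using the pinhole model. fx/fy are focal lengths in pixels,
    (cx, cy) is the principal point."""
    h, w = depth.shape
    # Pixel coordinate grids: u runs along columns, v along rows.
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# A flat wall one metre away, seen by a toy 4x4 camera.
depth = np.ones((4, 4))
cloud = depth_to_cloud(depth, fx=2.0, fy=2.0, cx=1.5, cy=1.5)
```

Each of these clouds, tagged with the colour of its source pixels and placed with the vehicle's pose estimate, becomes one fragment of the full 3D model.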
Take a look (or should I say a watch?) at the video; we found this post over at wired.co.uk pretty cool and worthy of sharing…