MIT SLAM quad using Kinect

Point cloud derived from the Kinect sensor

An MIT drone builds a map of a room in real time using an Xbox Kinect and is able to navigate through it. All calculations are performed on board the multicopter.

Students have developed a real-time visual odometry system that uses a Kinect to provide fast and accurate estimates of a vehicle’s 3D trajectory. The system builds on recent advances in visual odometry research, combining a number of ideas from state-of-the-art algorithms. It aligns successive camera frames by matching features across images, and uses the Kinect-derived depth estimates to determine the camera’s motion.
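The core step described above can be sketched as follows: once features are matched between two frames and back-projected into 3D using the Kinect depth, the camera motion is the rigid transform that best aligns the two point sets. This is a minimal illustrative sketch (not MIT's actual implementation) using the standard Kabsch/Umeyama alignment; the synthetic points and motion below are assumptions for demonstration only.

```python
import numpy as np

def estimate_rigid_motion(src, dst):
    """Estimate rotation R and translation t such that dst ≈ src @ R.T + t,
    given matched 3D points (N x 3), via the Kabsch SVD method."""
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Synthetic check: 3D feature points from "frame k", moved by a known motion.
rng = np.random.default_rng(0)
pts = rng.uniform(-2.0, 2.0, size=(50, 3))      # stand-in for depth-derived points
angle = np.deg2rad(5)                           # small yaw between frames
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.10, 0.00, 0.02])
moved = pts @ R_true.T + t_true                 # same points seen in "frame k+1"

R, t = estimate_rigid_motion(pts, moved)
assert np.allclose(R, R_true, atol=1e-8)
assert np.allclose(t, t_true, atol=1e-8)
```

In a real pipeline the matched points come from a feature detector run on the RGB images, and a robust loop such as RANSAC would wrap this estimator to reject bad matches before the final alignment.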

They have integrated the visual odometry into their quadrotor system, which was previously developed for controlling the vehicle with laser scan-matching. The visual odometry runs in real time, onboard the vehicle, and its estimates have low enough delay that the team can control the quadrotor using only the Kinect and the onboard IMU, enabling fully autonomous 3D flight in unknown, GPS-denied environments. Notably, it does not require a motion capture system or other external sensors: all sensing and computation required for local position control is done onboard the vehicle.

More here


1 comment for “MIT SLAM quad using Kinect”

  1. Spencer
    1 April 2011 at 6:52 am

    Now they can have Google Street View, AND Google indoor-view!

Comments are closed.