MIT SLAM quad using Kinect

Point cloud derived from the Kinect sensor

An MIT drone maps a room in real time using an Xbox Kinect and is able to navigate through it. All calculations are performed onboard the multicopter.

Students have developed a real-time visual odometry system that uses the Kinect to provide fast and accurate estimates of a vehicle's 3D trajectory. The system builds on recent advances in visual odometry research and combines a number of ideas from state-of-the-art algorithms. It aligns successive camera frames by matching features across images, and uses the Kinect-derived depth estimates to determine the camera's motion.
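The article does not publish any code, but the pipeline it describes (match features across successive frames, back-project the matches to 3D using Kinect depth, and solve for the rigid motion) can be sketched roughly as follows. This is a minimal illustration, not the MIT implementation: the ORB detector, the brute-force matcher, and the closed-form Horn/Kabsch alignment are all assumptions on my part.

```python
import numpy as np
import cv2

def frame_to_frame_motion(gray_prev, gray_curr, depth_prev, depth_curr, K):
    """Estimate rigid camera motion between two Kinect frames.

    gray_*  : 8-bit grayscale images
    depth_* : per-pixel depth in metres (0 where the Kinect has no return)
    K       : 3x3 camera intrinsics matrix
    """
    # 1. Match features across the two images.
    orb = cv2.ORB_create(500)
    kp1, des1 = orb.detectAndCompute(gray_prev, None)
    kp2, des2 = orb.detectAndCompute(gray_curr, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)

    # 2. Back-project matched pixels to 3D using the Kinect depth image.
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]

    def backproject(kp, depth):
        u, v = int(round(kp.pt[0])), int(round(kp.pt[1]))
        z = float(depth[v, u])
        if z <= 0.0:                       # no valid depth at this pixel
            return None
        return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

    P, Q = [], []
    for m in matches:
        p = backproject(kp1[m.queryIdx], depth_prev)
        q = backproject(kp2[m.trainIdx], depth_curr)
        if p is not None and q is not None:
            P.append(p)
            Q.append(q)
    P, Q = np.array(P), np.array(Q)
    assert len(P) >= 3, "need at least three depth-valid matches"

    # 3. Closed-form rigid alignment (Horn/Kabsch) of the 3D point sets,
    #    giving R, t with Q ~ R @ P + t (previous frame -> current frame).
    mp, mq = P.mean(axis=0), Q.mean(axis=0)
    U, _, Vt = np.linalg.svd((P - mp).T @ (Q - mq))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # repair a reflection, if any
        Vt[2] *= -1
        R = Vt.T @ U.T
    t = mq - R @ mp
    return R, t
```

A real system would additionally wrap the alignment step in RANSAC to reject mismatched features, and would need to keep up with the Kinect's 30 Hz frame rate on the vehicle's onboard computer.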

They have integrated the visual odometry into their quadrotor system, which was previously developed for controlling the vehicle with laser scan-matching. The visual odometry runs in real time onboard the vehicle, and its estimates have low enough delay that the team is able to control the quadrotor using only the Kinect and the onboard IMU, enabling fully autonomous 3D flight in unknown, GPS-denied environments. Notably, it does not require a motion capture system or other external sensors: all sensing and computation required for local position control is done onboard the vehicle.
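That last point, estimates arriving with low enough delay to close a feedback loop, is the crux. Here is a rough sketch of how a fast IMU prediction can be corrected by slower visual odometry fixes and fed to a position controller. The filter structure, gains, and names below are hypothetical; the actual vehicle uses a more sophisticated onboard state estimator than this toy complementary filter.

```python
import numpy as np

class ToyEstimator:
    """Hypothetical complementary-filter fusion: dead-reckon on the IMU at
    high rate, then pull the drifting estimate toward each slower visual
    odometry fix."""

    def __init__(self, gain=0.2):
        self.pos = np.zeros(3)   # position estimate, metres
        self.vel = np.zeros(3)   # velocity estimate, m/s
        self.gain = gain         # how strongly a VO fix corrects the IMU

    def imu_update(self, accel_world, dt):
        # Runs at IMU rate (hundreds of Hz): integrate acceleration.
        self.vel += accel_world * dt
        self.pos += self.vel * dt

    def vo_update(self, pos_vo):
        # Runs at camera rate (~30 Hz): blend in the visual odometry fix.
        self.pos += self.gain * (pos_vo - self.pos)

def position_command(est, setpoint, kp=1.2, kd=0.8):
    """Toy PD position controller: desired acceleration toward a setpoint."""
    return kp * (setpoint - est.pos) - kd * est.vel
```

Because the visual odometry fixes arrive with little delay, a simple proportional blend in vo_update suffices here; large sensing latency would instead force the estimator to re-align each correction with the state the vehicle was in when the image was captured.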


Gary Mortimer

Founder and Editor of sUAS News | Gary Mortimer has been a commercial balloon pilot for 25 years and also flies full-size helicopters. Prior to that, he made tea and coffee in air traffic control towers across the UK as a member of the Royal Air Force.