From the dawn of humanity we’ve needed to know where we are and how to get to where we want to go. Since the Phoenicians 3,000 years ago, mariners have steered by the sun and the stars, and heavenly bodies have been the basis for navigation ever since. From its invention in the 1700s, the sextant, an instrument for navigating by determining the angle between the horizon and a celestial body, has been carried on ships, on jetliners (early Boeing 747s had a sextant view port in the cockpit’s roof), and on Apollo spacecraft. Today constellations of positioning satellites (GPS, GLONASS, Galileo) help us find our way. So, too, for drones, which rely on positioning satellites like GPS, along with a compass and barometer, to know where they are.
Today there is an exciting new technology for drone navigation, one that looks toward the ground, not the sky: visual inertial odometry (VIO). It’s the first step on the road toward autonomous flight. And it’s here now, in the DBUS2.
The building blocks for the DBUS2’s VIO system are a single 160° ultra wide-angle camera, inertial motion sensors, and a lot of sophisticated software.
The camera depicts objects in scale relative to each other, but not in absolute scale, such as their size in meters or feet. Inertial sensors do measure absolute scale, but they are subject to drift: small positioning errors at the outset of travel become magnified over time. The VIO algorithms compensate for both limitations, using scale from the inertial sensors and positioning from the camera to tell the drone where it is in relation to its environment.
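To make the division of labor concrete, here is a minimal sketch of the scale-recovery idea. It is illustrative only, not the DBUS2’s actual algorithm, and the numbers are assumed: the camera reports geometry in unitless “map units,” the inertial sensors report metric displacement, and dividing one by the other pins down absolute scale.

```python
# Illustrative only: the camera's map is correct up to an unknown scale
# factor; the IMU's metric displacement recovers that factor.

def recover_scale(camera_displacement_units, imu_displacement_m):
    """Meters per camera 'map unit', from one stretch of travel."""
    return imu_displacement_m / camera_displacement_units

# Assumed readings: the camera says we moved 2.5 map units while the
# IMU measured 0.5 m of actual travel.
scale_m_per_unit = recover_scale(2.5, 0.5)  # 0.2 m per map unit

# A landmark the camera places 10 map units away is therefore ~2 m away.
landmark_distance_m = 10 * scale_m_per_unit
print(landmark_distance_m)  # 2.0
```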
For the DBUS2, VIO enables it to hover very accurately, within +/- 0.1 m (3.9 in.), to maintain its position, and to fly back to its hover point even if it’s been pulled away. The DBUS2’s Memory Cruise feature uses VIO to store the coordinates of its flight path. The drone can play back the coordinates to repeat a flight. It can also play them in reverse, enabling the drone to fly back home no matter how windy or narrow the route. Memory Cruise can record even when the motors are off, enabling you to hold the aircraft while you walk with it in order to memorize a flight path.
VIO gives you unprecedented control of your drone for exciting new kinds of flights, impressive video from vantage points you couldn’t get any other way, and loads of fun.
Beyond just fun, VIO has even greater potential for the economy. Freeing drones to fly autonomously could let drones do tasks they couldn’t when guided by human pilots.
Today one pilot controls one drone. The Federal Aviation Administration has predicted there will be 7 million drones active in the skies over our heads by 2020. In the not too distant future drone flights could reach much higher numbers. UPS and FedEx alone deliver about 20 million packages daily. In just one major city, Tokyo, more than 1 million people take a taxi each day. For drones to handle even a fraction of this volume, drone delivery and air taxi flights would have to increase dramatically.
If drone flights reach these numbers there will be far too many for human pilots to manage. But if drones could fly themselves, seeing things like a safe path among buildings, optimum landing spots, and how to navigate without GPS inside structures, they wouldn’t need human intervention for every minute of their flight. Drones would be able to handle all kinds of aerial tasks on their own, including those yet undreamt of.
Here’s How VIO Works
A VIO system uses a camera to capture images of what lies ahead and below. For selected frames, image-processing software builds a point cloud (a set of data points in space) around the edges of each meaningful shape in the frame, marking where each edge sits in the frame.
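The edge-marking step can be sketched in a few lines. This is a deliberately naive gradient threshold, assuming a tiny grayscale frame; production VIO pipelines use far more robust feature detectors, but the idea of collecting the (x, y) positions of strong edges is the same.

```python
# Naive sketch: flag pixels where brightness changes sharply, and record
# their (x, y) positions as the frame's "edge points".

def edge_points(frame, threshold=50):
    """Return (x, y) coordinates with strong horizontal/vertical gradients."""
    points = []
    h, w = len(frame), len(frame[0])
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = frame[y][x + 1] - frame[y][x - 1]
            gy = frame[y + 1][x] - frame[y - 1][x]
            if gx * gx + gy * gy > threshold * threshold:
                points.append((x, y))
    return points

# A tiny 5x5 grayscale frame with a bright square in the middle.
frame = [
    [0,   0,   0,   0, 0],
    [0, 200, 200, 200, 0],
    [0, 200, 200, 200, 0],
    [0, 200, 200, 200, 0],
    [0,   0,   0,   0, 0],
]
print(edge_points(frame))  # points cluster along the square's border
```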
At the same time, VIO uses the drone’s inertial measurement unit (IMU) to track its movement in space: up or down, back or forward, left or right. These exquisitely sensitive sensors track the drone’s motion relative to its starting point from millisecond to millisecond.
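It is worth seeing why the inertial side cannot stand alone. A minimal one-dimensional sketch, with assumed numbers: integrating accelerometer samples twice yields position, but even a tiny constant sensor bias grows quadratically with time. That growing error is exactly the drift the camera measurements exist to correct.

```python
# Dead-reckoning sketch: double-integrate acceleration into 1-D position.
# A small constant bias produces position error that grows with time^2.

def integrate_position(accels_mps2, dt_s):
    """Double-integrate acceleration samples into position (meters)."""
    velocity, position = 0.0, 0.0
    for a in accels_mps2:
        velocity += a * dt_s
        position += velocity * dt_s
    return position

dt = 0.01                                  # 100 Hz IMU (assumed rate)
true_accels = [0.0] * 1000                 # drone actually holds still for 10 s
biased = [a + 0.02 for a in true_accels]   # 0.02 m/s^2 sensor bias (assumed)

print(integrate_position(true_accels, dt))  # 0.0 — no motion
print(integrate_position(biased, dt))       # ~1.0 m of phantom drift
```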
The VIO system then processes another image. Now there are two images, with the objects in each frame grouped relative to each other but shifted left or right, up or down, because the drone, and with it the camera’s viewpoint, has moved.
What happens next is like magic. VIO algorithms combine the elapsed time between the two images, the drone’s change in position as measured by the inertial sensors, and the shift of the group of objects between the two images to figure out where the drone is in relation to its environment.
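A simplified piece of that geometry can be written down directly. Assuming a pinhole camera model and an illustrative focal length, the IMU supplies a metric baseline between the two frames, the images supply a pixel shift (parallax) for a feature, and similar triangles give that feature’s distance. Real VIO solves this jointly for many features in full 3-D; this is only the one-feature, sideways-motion case.

```python
# Two-view geometry sketch (pinhole model): with a metric baseline from the
# IMU and a pixel shift from the images, similar triangles give distance.

def depth_from_parallax(focal_px, baseline_m, disparity_px):
    """Distance to a feature seen from two views a known baseline apart."""
    return focal_px * baseline_m / disparity_px

# Assumed numbers: 400 px focal length, the IMU says the drone moved 0.5 m
# sideways, and the feature shifted 20 px between the two frames.
print(depth_from_parallax(400, 0.5, 20))  # 10.0 m away
```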
It’s a whole lot of math that keeps the DBUS2 remarkably on course: less than 5% drift over a 300 m (984 ft.) flight path. A commercial airliner does considerably better—1/10th of 1% over 500 miles—but its more elaborate and far more costly inertial navigation systems benefit from being cross-checked against GPS.
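Those two drift figures, worked out side by side:

```python
# Drift budgets from the figures above, in their own units.
drone_drift_m = 0.05 * 300          # 5% of a 300 m path: up to 15 m off
airliner_drift_miles = 0.001 * 500  # 0.1% of 500 miles: about 0.5 miles off
print(drone_drift_m, airliner_drift_miles)
```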
GPS data is on the horizon for VIO in small drones, along with awareness of restricted airspace such as no-fly zones, and potentially integration with terrain databases and with LAANC (the Low Altitude Authorization and Notification Capability, a system for obtaining permission to fly near airports).
As VIO evolves, expect it to recognize textures better, along with fine movements of objects it sees: elasticity, flows, subtle changes in shapes and reflectivity (much the same way Polynesians used sea currents to navigate on the open ocean).
Along with collision avoidance (via VIO, LIDAR, sonar and/or radar) and data from magnetic sensors and radio navigation, this is a formidable suite of smarts to enable drones to fly on their own, whether it’s delivering a package, saving a life, inspecting a bridge or any number of tasks.
Or, with the DBUS2, just for fun in your backyard or wherever your travels may take you.