Drone learns to avoid trees with a little help from a friend.

The BIRD Multi-University Research Initiative (MURI) project envisions mini Unmanned Aerial Vehicles (UAVs) that autonomously navigate densely cluttered environments, like forests.

Toward this end, CMU is working on reactive controllers and receding horizon control.

Reactive Controller using DAgger

We use imitation learning to train the drone on the expert’s control inputs iteratively: we compute a number of visual features from the image stream and then perform a linear ridge regression of the expert’s control inputs on the feature vectors. The generated controller thus learns to correlate specific changes in visual features with a particular control input (in our case, a roll to the left or right). For instance, consider optical flow: a tree close to the camera moves faster across the image than trees farther away, and as the expert avoids the tree by moving sideways, the controller learns to associate that specific change in optical flow with a command to evade right or left.
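To make the pipeline concrete, here is a minimal sketch of the feature-to-command regression, assuming the per-frame features and the expert’s roll commands have already been extracted into arrays. The file names, shapes, and the regularization strength are illustrative, not the project’s actual settings.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Hypothetical training data: one row of visual features (e.g. optical
# flow statistics) per camera frame, and the expert's roll command
# recorded for that frame.
X = np.load("features.npy")   # shape: (n_frames, n_features)
y = np.load("commands.npy")   # shape: (n_frames,), roll in [-1, 1]

# Ridge regression: the L2 penalty keeps the weights small, which helps
# when many visual features are strongly correlated with one another.
model = Ridge(alpha=1.0)
model.fit(X, y)

def controller(features):
    """Map one frame's feature vector straight to a roll command."""
    raw = model.predict(features.reshape(1, -1))[0]
    return float(np.clip(raw, -1.0, 1.0))
```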

After the first few flights with the expert in control, we generate a preliminary controller and start flying the drone with only the controller commanding it. The operator then provides expert input based on the image stream, and a new controller is generated from the aggregated data. This process continues until we obtain a satisfactory controller that has visited enough states to avoid trees consistently; a sketch of the loop follows below. For a more rigorous discussion, we recommend reading our paper.
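This iterative procedure is the DAgger (Dataset Aggregation) algorithm: every state visited under the current controller is labeled with the expert’s command, and the regression is retrained on the aggregate. The sketch below shows only the shape of the loop; `expert`, `extract_features`, and `fly` are hypothetical stand-ins for the human pilot, the feature pipeline, and a data-collection flight.

```python
import numpy as np
from sklearn.linear_model import Ridge

def dagger(expert, extract_features, fly, n_iterations=5):
    """Minimal DAgger loop sketch; all callables are illustrative.

    expert(frame)           -> expert's roll command for a frame
    extract_features(frame) -> feature vector for a frame
    fly(policy)             -> camera frames from one flight under policy
    """
    X, y = [], []
    policy = expert  # iteration 0: the expert flies the drone
    for _ in range(n_iterations):
        frames = fly(policy)
        # Aggregate: label every visited state with the expert's command,
        # even though the current policy chose the actions in flight.
        X.extend(extract_features(f) for f in frames)
        y.extend(expert(f) for f in frames)
        model = Ridge(alpha=1.0).fit(np.array(X), np.array(y))
        policy = lambda f, m=model: float(
            m.predict(extract_features(f).reshape(1, -1))[0])
    return policy
```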

Here’s a video of the system in action:

http://robotwhisperer.org/bird-muri/

Press