Microsoft wants to interrogate your mistakes. Arriving on GitHub today: AirSim, a free drone simulator that learns with you, feeding your meatware control moves back into the Artificial Intelligence algorithms that control the robot.
Our goal is to develop AirSim as a platform for AI research, letting researchers experiment with deep learning, computer vision and reinforcement learning algorithms for autonomous vehicles. For this purpose, AirSim also exposes APIs to retrieve data and control vehicles in a platform-independent way.
There are two ways you can generate training data from AirSim for deep learning. The easiest is to simply press the record button in the lower-right corner. This starts writing the vehicle's pose and camera images for each frame.
If you would like more data logging capabilities and other features, file a feature request or contribute changes. The data logging code is pretty simple, and you can modify it to your heart’s desire.
A more complex way to generate training data is by writing client code that uses our APIs. This allows you to be in full control of how, what, where and when you want to log data. See the next section for more details.
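As a sketch of what such client code might look like, here is a minimal per-frame pose-logging loop. The client methods `get_position` and `get_orientation` are illustrative names for this example, not AirSim's actual API; any real client exposing pose data could be dropped in.

```python
import csv

def pose_to_row(frame, position, orientation):
    """Flatten one pose sample into a CSV row:
    frame index, x, y, z, then quaternion w, x, y, z."""
    return [frame, *position, *orientation]

def log_poses(client, n_frames, writer):
    """Poll a drone client once per frame and log its pose.

    `client` is any object with get_position() -> (x, y, z) and
    get_orientation() -> (w, x, y, z); these method names are
    assumptions for illustration, not AirSim's real API.
    """
    for i in range(n_frames):
        pos = client.get_position()       # (x, y, z) in metres
        quat = client.get_orientation()   # (w, x, y, z) quaternion
        writer.writerow(pose_to_row(i, pos, quat))
```

In an AirSim-backed version you would additionally request camera images each frame and write them alongside the pose rows, giving you full control over what, where and when data is logged.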
AirSim exposes easy-to-use APIs to retrieve data from the drones, including ground truth, sensor data and various images. It also exposes APIs to control the drones in a platform-independent way. This lets you use the same code to control different drone platforms, for example Pixhawk or DJI Matrice, without making changes and without having to learn internal protocol details.
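The platform-independent idea can be sketched as user code written against a common interface, with per-platform backends behind it. The class and method names below are hypothetical stand-ins, not AirSim's real drivers; the point is only that `mission` never changes when the backend does.

```python
from abc import ABC, abstractmethod

class DroneBackend(ABC):
    """Minimal platform-independent control surface (illustrative)."""

    @abstractmethod
    def arm(self) -> str: ...

    @abstractmethod
    def takeoff(self, altitude_m: float) -> str: ...

class PixhawkBackend(DroneBackend):
    def arm(self):
        return "pixhawk: armed"            # would speak MAVLink here
    def takeoff(self, altitude_m):
        return f"pixhawk: takeoff to {altitude_m} m"

class MatriceBackend(DroneBackend):
    def arm(self):
        return "matrice: armed"            # would speak the DJI SDK here
    def takeoff(self, altitude_m):
        return f"matrice: takeoff to {altitude_m} m"

def mission(drone: DroneBackend):
    """User code is written once against the interface; swapping a
    Pixhawk for a DJI Matrice requires no changes to this function."""
    return [drone.arm(), drone.takeoff(10)]
```

The protocol details (MAVLink framing, DJI SDK calls) live entirely inside each backend, which is why the caller never has to learn them.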
These APIs are also available as part of a separate, independent cross-platform library, so you can deploy them on an offboard computer on your vehicle. This way you can write and test your code in the simulator and later execute it on real drones. Transfer learning and related research is one of our focus areas.
All way above my paygrade, but there is a paper here that explains it all.