California Wildfires (Part 2): How 360 drone panoramics become a first-responder tool

By Greg Crutsinger, Drone Scholars

I had some time to think in between the two dozen drone flights it took to map the Coffey Park neighbourhood in Santa Rosa, one of the hardest hit by the recent wildfires in California (see Part 1). As I stood in front of the burned-out homes and cars in one of many cul-de-sacs in the neighbourhood, I couldn’t begin to imagine the heartbreak the residents must be feeling. Thousands of families lost everything in an instant. And this was one of many neighbourhoods impacted by multiple fires burning across the state. Tens of thousands of people were still evacuated, anxiously awaiting news of their properties.

I tried to focus on the task at hand: collecting and processing thousands of photos into high-resolution maps. I knew it would be a lot of work and take several days to finish. I found myself contemplating easier, faster ways to capture high-resolution, georeferenced drone data. Perhaps something in between single snapshots or video flyovers and the high-resolution maps we were collecting.

I should preface this by saying that my background isn't in public safety, or even as a drone service provider. I was volunteering to assist a UAV team from the local sheriff's department that was supporting the county in its management efforts. They did all the flying, and I helped coordinate the missions for post-processing. My background is actually in science.

So, like a scientist, I ran a simple experiment.

I had one of the officers download the 'Hangar 360 for DJI' app, a free app from the Hangar team in Austin, TX that is compatible with most DJI drones and iOS devices. With a single button push, the app collects around two dozen photos at different gimbal angles as the drone yaws in a circle. When the copter lands, the user transfers the images to their phone and then to the Hangar cloud for stitching into a 360 panoramic. In about 20-30 minutes, a web link is made available from the cloud that can be shared with anyone. The method is simple, and the outputs are both intuitive for scene awareness and compelling.
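In essence, the one-button capture amounts to flying a fixed pattern of yaw headings at a few gimbal pitches. As a rough sketch (the ring angles and photo counts below are illustrative assumptions, not Hangar's actual flight plan), the waypoint list for a roughly two-dozen-photo panoramic might be generated like this:

```python
# Hypothetical sketch of the kind of capture pattern a 360 panoramic app
# might fly: a few gimbal-pitch "rings", each split into evenly spaced
# yaw headings, plus a single straight-down (nadir) shot.

def pano_waypoints(rings):
    """rings: list of (gimbal_pitch_deg, photos_in_ring) tuples."""
    waypoints = []
    for pitch, n in rings:
        for i in range(n):
            yaw = 360.0 * i / n  # evenly spaced headings around the circle
            waypoints.append((round(yaw, 1), pitch))
    return waypoints

# Three rings plus a nadir shot -> roughly two dozen photos.
plan = pano_waypoints([(0, 8), (-30, 8), (-60, 6), (-90, 1)])
print(len(plan))  # 23 photos
```

The choice of rings trades capture time against stitching overlap; the point is simply that the whole flight reduces to a short, repeatable list of (yaw, pitch) pairs, which is why a one-button app can automate it.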

Below is the very first 360 panoramic at Coffey Park. I think the results speak for themselves as to the scale of the fire and the heavy losses in the Santa Rosa community.

https://world.hangar.com/three-sixty/BY1ooe4r

After seeing the initial results, the team collected several more panoramics throughout the Coffey Park neighbourhood. The approach was easy to train others on and took a fraction of the time compared to the full-scale mapping we were just finishing. The following day, another UAV team started collecting panoramics in other impacted neighbourhoods. The experiment was spreading organically, a testament to its simplicity.

Meanwhile, the Hangar team back in Austin was working to put the panoramic locations into a more user-friendly format. To Hangar's credit, this was a significant (voluntary) effort by their team to assist in the fire response, turned around in ~24 hrs. Here are the results from the Fountain Grove neighbourhood in Santa Rosa.

Click on the red and grey Hangar logo in the upper left corner to reveal the inset map.

https://view.hangar.com/Hangar-Team/Alameda-County/index.html

Note: All the panoramic imagery shown has been made publicly available by Sonoma County here.

I find the user experience of the Hangar platform simple and clean, including the ability to rotate the panoramic manually through 360 degrees or let it slowly autorotate. You can zoom in on both the image and the map, with pins dropped for each location. The field of view (FOV) is displayed as a cone that rotates as the panoramic moves, providing a better frame of reference for the pointing direction within the imagery. The FOV cone narrows or widens as you zoom on the image.
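For readers curious how a zoom-responsive FOV cone could be driven, the usual geometric relation is that a zoom factor z narrows the field of view to 2·atan(tan(fov0/2)/z). A minimal sketch, assuming a 94-degree base FOV (a typical wide-angle drone camera value, used here purely as an assumption; this is not a description of Hangar's internals):

```python
import math

# Illustrative relation between zoom factor and the displayed FOV cone:
# zooming by factor z narrows the field of view so that
#   fov(z) = 2 * atan(tan(fov0 / 2) / z)

def fov_degrees(base_fov_deg, zoom):
    half = math.radians(base_fov_deg) / 2
    return math.degrees(2 * math.atan(math.tan(half) / zoom))

print(round(fov_degrees(94, 1), 1))  # full 94-degree cone at no zoom
print(round(fov_degrees(94, 2), 1))  # noticeably narrower cone at 2x
```

A map overlay only needs this angle plus the current heading to redraw the cone, which is why the widget can track zoom and rotation smoothly.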

It is easy to imagine how such a map could be populated by multiple teams from different agencies coordinating with off-the-shelf drones and their smartphones. Results could be shared in near real time. Perhaps the ability to annotate the panoramics themselves could be added; dropping pins or notes at specific locations within the images would help improve search and clear efforts, for example.

I know the end results of this experiment are not machine learning, artificial intelligence, swarming behaviour, or any other hot topic of the moment. Some might point out all the issues still to sort out around public safety use, data security, and sharing. Others might say that the high-resolution maps are still incredibly valuable. Other startups may have their own solutions. Fair enough.

In my opinion (scientific, not disaster-expert), this was one successful experiment, born of the necessity to share data quickly during the California wildfires. It is a proof of concept that is quick, effective, and ready to use for the next [insert disaster here]. My opinions are my own and do not reflect those of the agencies involved.

The next step is to scale the experiment more broadly, under less demanding flying conditions than a disaster zone. So I have launched a bigger experiment, the Fly4Fall Campaign, in partnership with a range of universities, other private companies, and citizen volunteers. The goal of Fly4Fall is to carry out a survey of autumn plants everywhere and show what the drone community can do. Volunteer drone pilots across the world can help. Go to www.Fly4Fall.com to learn more.

From fire to foliage, perhaps the simplest approaches can be the most effective.

Patrick talks to Greg here.