How long does an insurance claim take after a wildfire ravages your home?
How many police officers, emergency crews and council teams have to comb through the rubble, taking pictures and documenting the damage? Communities and insurance companies need anywhere from six to nine months, and thousands of experts and volunteers, to get a precise overview of the impact.
At Unleash live, we asked ourselves, what if it could be done quicker?
When wildfires passed through the Northern California town of Paradise in early November, Unleash live wanted to help first responders gain operational clarity fast, to assist in recovery efforts and accelerate the process for homeowners.
Through the fantastic work of Butte County, together with DJI, DroneDeploy, Survae and Scholar Farms, the county assembled a high-resolution dataset of the affected area.
With the Camp Fire dataset, Unleash live analysed 17,000 acres of high-resolution imagery of Paradise, providing valuable insights such as the count, GPS location and square footage of affected structures; the count and location of intact structures; and the number of damaged vehicles.
Unleash live achieves this outcome in near real time: drones fly, data is ingested over a WiFi/4G network or satellite link (with post-flight upload as a fallback option), A.I. video analytics are applied in flight, and actionable insight is returned fast to mobile and tablet devices on the ground.
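The geospatial half of these insights follows from the imagery being geo-referenced. As a minimal sketch, assuming a north-up orthophoto with a known top-left GPS origin and ground sampling distance (the function and parameter names here are illustrative, not Unleash live's actual API), a detected pixel bounding box can be converted into a pin location and an approximate footprint in square feet:

```python
import math

def detection_to_gps_and_area(bbox_px, origin, px_size_m):
    """Convert a pixel bounding box on a north-up orthophoto into a
    GPS centre point and footprint area in square feet.

    bbox_px   : (x_min, y_min, x_max, y_max) in pixels
    origin    : (lat, lon) of the image's top-left corner
    px_size_m : ground sampling distance in metres per pixel
    """
    x_min, y_min, x_max, y_max = bbox_px
    lat0, lon0 = origin
    # Pixel offsets of the box centre from the top-left corner.
    cx = (x_min + x_max) / 2
    cy = (y_min + y_max) / 2
    # Metres per degree (spherical-Earth approximation).
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(lat0))
    lat = lat0 - (cy * px_size_m) / m_per_deg_lat  # image y grows southward
    lon = lon0 + (cx * px_size_m) / m_per_deg_lon
    # Footprint area: pixels -> square metres -> square feet.
    area_m2 = (x_max - x_min) * (y_max - y_min) * px_size_m ** 2
    area_sqft = area_m2 * 10.7639
    return (lat, lon), area_sqft
```

Summing such per-detection records over an area yields the counts and square footage reported above.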
With Unleash live’s ability to integrate other GIS and sensor data we are quickly able to extend the insights generated even further.
Here is the story of how we did it.
Training a high-performing custom A.I. for damage detection
Through our partnership with DJI Enterprise and emergency services, we trained a custom A.I. algorithm that identifies property affected by fire in live geo-referenced video feeds or geo-referenced images.
To build this A.I., various police and emergency services uploaded high-resolution datasets to Unleash live in early 2018. With our A.I. cloud server infrastructure, powered by high-performance NVIDIA GPUs and AWS, we were able to train a neural network and deploy it to our A.I. sandbox within a few weeks.
Early tests with emergency crews on the ground, streaming live video back to Unleash live, showed inference accuracy of more than 80%. We then also tested the A.I. on fire footage from other areas, such as the Santa Rosa fires and the fires around Athens in Greece. These early tests were very encouraging for the team: the A.I. could be extended to any future large-scale fire.
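An accuracy figure like this is typically measured by matching predicted boxes to hand-labelled ground-truth boxes by intersection-over-union. A minimal sketch of such an evaluation (illustrative only, not Unleash live's internal tooling):

```python
def iou(a, b):
    """Intersection-over-union of two (x_min, y_min, x_max, y_max) boxes."""
    ix = max(0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def precision_recall(predictions, ground_truth, iou_threshold=0.5):
    """Greedily match each predicted box to its best unmatched
    ground-truth box; a match above the IoU threshold is a true positive."""
    unmatched = list(ground_truth)
    tp = 0
    for box in predictions:
        best = max(unmatched, key=lambda g: iou(box, g), default=None)
        if best is not None and iou(box, best) >= iou_threshold:
            tp += 1
            unmatched.remove(best)
    fp = len(predictions) - tp   # predictions with no ground-truth match
    fn = len(unmatched)          # ground-truth boxes the model missed
    precision = tp / (tp + fp) if predictions else 0.0
    recall = tp / (tp + fn) if ground_truth else 0.0
    return precision, recall
```

Running this over a labelled test area gives the kind of "more than 80%" figure quoted above, broken down into false positives (precision) and misses (recall).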
Extracting insights from the Camp Fire dataset
Once we had access to the Camp Fire high-resolution imagery shot by DJI drones, we selected several large test areas and deployed our A.I. algorithm.
We ran the A.I. against various subset locations to assess the accuracy. Here are a couple of examples of our first-pass output.
Within minutes the Unleash live Fusion Atlas presents the output: affected homes are marked with pins on a layered map in any browser or tablet. This gives operational clarity to responders as well as utilities, councils and families.
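Map pins like these are commonly exchanged with browser map layers as GeoJSON. As a hedged sketch of how geo-located detections could be packaged into a FeatureCollection (the field names are assumptions for illustration, not Fusion Atlas's actual schema):

```python
import json

def detections_to_geojson(detections):
    """Convert detection records into a GeoJSON FeatureCollection that a
    browser map layer can render as pins.

    Each detection is a dict with 'lat', 'lon', 'label' and 'confidence'.
    """
    features = [
        {
            "type": "Feature",
            "geometry": {
                "type": "Point",
                # GeoJSON ordering is [longitude, latitude].
                "coordinates": [d["lon"], d["lat"]],
            },
            "properties": {
                "label": d["label"],
                "confidence": d["confidence"],
            },
        }
        for d in detections
    ]
    return json.dumps({"type": "FeatureCollection", "features": features})
```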
Others may want a visual before-and-after comparison as well as A.I. overlays; both are available through our difference detection tools.
Learnings and Next Steps
- Like any other A.I., ours still returns false positives: scrub or burnt trees are sometimes incorrectly identified as burnt structures. State-of-the-art A.I. models for detection in complex scenery typically have an accuracy of 80%–90%, so at least one in ten objects will be identified incorrectly. We need to accept this.
- A.I. helps with large, dispersed quantities of raw data; it is not an engineering tool. It excels at getting a grasp of a complex situation fast, focusing the human on the most important aspects and removing the hazardous, boring and routine elements.
- A.I. will never fully replace a human. This A.I. was trained for only two weeks, on Carr Fire footage captured from different heights and in varied lighting conditions, and then transferred to the Camp Fire. Considering that, it holds up very well, maintaining inference accuracy above 80%.
- Aerial data is different from street-level data. Unleash live ingests footage from cameras on the ground as well as from drones and satellites, giving developers and enterprises the flexibility to customise their A.I. models to their specific needs. Identifying a burnt car at street level requires a different dataset than recognising the same car from an aerial or satellite perspective.
At Unleash live, together with our partners, we are now working to overcome these challenges with our unique video A.I. analytics capability and further refinement of our A.I. development work. With video, not every single frame needs to identify the object; a confident detection roughly once per second is sufficient to lift the overall inference accuracy. All this will help extend our A.I. analysis to many further enterprise use cases.
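The once-per-second fix amounts to temporal voting: an object only counts if it is detected in several frames within a short window, which suppresses single-frame flickers and false positives. A minimal sketch, assuming a tracker has already assigned stable ids to objects across frames (names and thresholds are illustrative, not Unleash live's actual pipeline):

```python
from collections import Counter

def confirm_detections(frame_detections, fps=30, window_s=1.0, min_hits=3):
    """Confirm an object only if it appears in at least `min_hits` frames
    within each window, suppressing one-frame flickers.

    frame_detections: list (one entry per frame) of sets of object ids.
    """
    window = int(fps * window_s)
    confirmed = set()
    for start in range(0, len(frame_detections), window):
        # Count how often each object id appears within this window.
        hits = Counter()
        for frame in frame_detections[start:start + window]:
            hits.update(frame)
        confirmed.update(obj for obj, n in hits.items() if n >= min_hits)
    return confirmed
```

An object seen in five frames of a one-second window is kept; one that flashes up in a single frame is discarded, pushing overall accuracy above what any single frame achieves.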
Here is an example of how the Unleash live video A.I. performs on fire damage detection.
The work to help emergency services and councils worldwide is extremely exciting for our team, and we invite everyone to contribute to make this capability even more valuable. Tell us what you would do to support better decision making in vast, geographically dispersed areas.