Automatic quantification of settlement damage using deep learning of satellite images

Date published

2021-10-15

Publisher

IEEE

Type

Conference paper

ISSN

2687-8860

Citation

Lu L, Guo W. (2021) Automatic quantification of settlement damage using deep learning of satellite images. In: 2021 IEEE International Smart Cities Conference (ISC2), 7-10 September 2021, Virtual Event

Abstract

Humanitarian disasters and political violence cause significant damage to our living space. The reparation cost to homes, infrastructure, and the ecosystem is often difficult to quantify in real-time. Real-time quantification is critical both to informing relief operations and to planning ahead for rebuilding. Here, we use satellite images before and after major crises around the world to train a robust baseline Residual Network (ResNet) and a disaster quantification Pyramid Scene Parsing Network (PSPNet). ResNet offers robustness to poor image quality and can identify areas of destruction with high accuracy (92%), whereas PSPNet offers contextualised quantification of built environment damage with good accuracy (84%). As there are multiple damage dimensions to consider (e.g. economic loss and fatalities), we fit a multi-linear regression model to quantify the overall damage. To validate our combined system of deep learning and regression modeling, we successfully match our prediction to the ongoing recovery in the 2020 Beirut port explosion. These innovations provide a better quantification of overall disaster magnitude and inform intelligent humanitarian systems of unfolding disasters.
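The final step the abstract describes, fitting a multi-linear regression over several damage dimensions to produce an overall damage score, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature columns (built-area destruction fraction from segmentation, normalised economic loss, normalised fatalities), the target scores, and all numeric values are hypothetical.

```python
import numpy as np

# Hypothetical per-event damage dimensions (feature columns) and an
# overall damage score (target). Columns: fraction of built area
# destroyed (e.g. from a segmentation model), economic loss
# (normalised), fatalities (normalised). All values are illustrative.
X = np.array([
    [0.10, 0.05, 0.02],
    [0.40, 0.30, 0.20],
    [0.75, 0.60, 0.55],
    [0.90, 0.85, 0.70],
])
y = np.array([0.069, 0.330, 0.665, 0.845])  # overall damage magnitude

# Fit a multi-linear regression y ≈ X @ w + b by least squares,
# appending a column of ones to estimate the intercept b.
A = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
w, b = coef[:-1], coef[-1]

# Score a new, unseen event from its damage dimensions.
x_new = np.array([0.55, 0.40, 0.30])
pred = float(x_new @ w + b)
```

A single linear model keeps the combined score interpretable: each weight in `w` indicates how strongly its damage dimension drives the overall magnitude.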

Keywords

Deep learning, Image quality, Technological innovation, Satellites, Smart cities, Predictive models, Real-time systems

Rights

Attribution-NonCommercial 4.0 International
