Wildfire and smoke early detection for drone applications: a light-weight deep learning approach

Date published

2024-10-01

Free to read from

2024-08-12

Publisher

Elsevier

Type

Article

ISSN

0952-1976

Citation

Kumar A, Perrusquía A, Al-Rubaye S, Guo W. (2024) Wildfire and smoke early detection for drone applications: a light-weight deep learning approach. Engineering Applications of Artificial Intelligence, Volume 136, Part B, October 2024, Article number 108977

Abstract

Drones have become a crucial element in current wildfire and smoke detection applications. Several deep learning architectures have been developed to detect fire and smoke using either colour-based methodologies or semantic segmentation techniques, with impressive results. However, the computational demands of these models reduce their usability on memory-restricted devices such as drones. To overcome this memory constraint whilst maintaining the high detection capabilities of deep learning models, this paper proposes two lightweight architectures for fire and smoke detection in forest environments. The approaches use the DeepLabv3+ architecture for image segmentation as a baseline. The novelty lies in the incorporation of vision transformers and a lightweight convolutional neural network architecture that heavily reduces the model complexity whilst maintaining state-of-the-art performance. Two datasets for fire and smoke segmentation, based on the Corsican, FLAME, SMOKE5K, and AI-For-Mankind datasets, are created to cover different real-world wildfire scenarios and produce models with better detection capabilities. Experiments are conducted to show the benefits of the proposed approach and its relevance to current drone-based wildfire detection applications.
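
This record does not include the authors' code. As a rough illustration of the model family the abstract describes (a DeepLab-style segmentation head on a lightweight mobile backbone), the sketch below builds torchvision's DeepLabv3 with a MobileNetV3-Large encoder. This is not the paper's architecture: torchvision provides DeepLabv3 rather than DeepLabv3+, and the three-class layout (background / smoke / fire) and the 512x512 input resolution are assumptions for illustration only.

```python
# Minimal sketch, NOT the authors' implementation: a lightweight mobile
# encoder paired with a DeepLab segmentation head, in the spirit of the
# abstract. Assumptions: 3 output classes (background / smoke / fire),
# 512x512 RGB input; torchvision offers DeepLabv3, not DeepLabv3+.
import torch
from torchvision.models.segmentation import deeplabv3_mobilenet_v3_large

NUM_CLASSES = 3  # assumed class layout: background, smoke, fire

# Build the model from scratch (no pretrained weights downloaded).
model = deeplabv3_mobilenet_v3_large(
    weights=None, weights_backbone=None, num_classes=NUM_CLASSES
)
model.eval()

# Dummy forward pass standing in for a single drone camera frame.
frame = torch.randn(1, 3, 512, 512)
with torch.no_grad():
    logits = model(frame)["out"]  # shape: (1, NUM_CLASSES, 512, 512)
mask = logits.argmax(dim=1)       # per-pixel class prediction
print(mask.shape)                 # torch.Size([1, 512, 512])
```

The point of such a backbone swap, as the abstract argues, is to cut parameter count and memory footprint enough for on-drone inference while keeping segmentation quality close to heavier baselines.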

Keywords

DeepLabv3+, Mobile vision transformers, MobileNet, Wildfire and smoke detection, Segmentation

Rights

Creative Commons Attribution 4.0 International (CC BY 4.0)
