Mapping agricultural land in support of opium monitoring in Afghanistan with Convolutional Neural Networks (CNNs).

Date

2021-12

Publisher

Cranfield University

Department

SWEE

Type

Thesis or dissertation

Abstract

This work investigates the use of advanced image classification techniques to improve the accuracy and efficiency of determining agricultural areas from satellite images. The United Nations Office on Drugs and Crime (UNODC) needs to accurately delineate the potential area under opium cultivation as part of its opium monitoring programme in Afghanistan. It currently uses unsupervised image classification, but this is unable to separate some areas of agriculture from natural vegetation and requires time-consuming manual editing; the task is significant because each image must be classified and interpreted separately. The aim of this research is to derive information about annual changes in land use related to opium cultivation by applying convolutional neural networks (CNNs) to Earth observation data. Supervised machine learning techniques were first investigated for agricultural land classification using training data from existing manual interpretations. Although pixel-based machine learning techniques achieved high overall classification accuracy (89%), they had difficulty separating agriculture from natural vegetation at some locations. CNNs have achieved ground-breaking performance in computer vision applications; they use localised image features and support transfer learning, which can overcome the limitations of pixel-based methods. However, training CNNs for land cover classification is challenging because of underlying radiometric and temporal variations in satellite image datasets. The CNNs were optimised with a targeted sampling strategy focused on areas of known confusion (agricultural boundaries and natural vegetation), improving overall classification accuracy by 6%. Localised differences in agricultural mapping were identified using a new tool called ‘localised intersection over union’, which provides greater insight than commonly used assessment measures (overall accuracy and the kappa statistic) that are not suitable for comparing smaller differences in mapping accuracy. A generalised fully convolutional network (FCN) model was developed and evaluated using six years of data and transfer learning. Image datasets were standardised across image dates and different sensors (DMC, Landsat, and Sentinel-2), achieving high classification accuracy (up to 95%) with no additional training. Fine-tuning with minimal training data and a targeted training strategy further increased model performance between years (up to +5%). The annual changes in agricultural area from 2010 to 2019 were mapped in Helmand Province, Afghanistan, using the generalised FCN model, providing new insight into the expansion of agriculture into marginal areas in response to counter-narcotic and alternative livelihoods policy. New areas of cultivation were found to contribute to the expansion of opium cultivation in Helmand Province. The approach demonstrates the use of FCNs for fully automated land cover classification: they are fast and efficient, can classify satellite imagery from different sensors, and can be continually refined using transfer learning. The proposed method avoids the manual effort associated with mapping agricultural areas within the opium survey while improving accuracy. These findings have wider implications for improving land cover classification using legacy data on scalable cloud-based platforms.
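
As a rough illustration of the ‘localised intersection over union’ assessment described above, the sketch below computes per-window IoU scores for an agriculture class from two co-registered binary masks. The window size, variable names, and random test arrays are illustrative assumptions for this sketch, not the implementation used in the thesis.

import numpy as np

def localised_iou(pred, ref, window=64):
    """Per-window IoU for the agriculture class (1 = agriculture, 0 = other)."""
    rows, cols = pred.shape
    scores = np.full((rows // window, cols // window), np.nan)
    for i in range(rows // window):
        for j in range(cols // window):
            p = pred[i*window:(i+1)*window, j*window:(j+1)*window].astype(bool)
            r = ref[i*window:(i+1)*window, j*window:(j+1)*window].astype(bool)
            union = np.logical_or(p, r).sum()
            if union > 0:  # windows with no agriculture in either map stay NaN
                scores[i, j] = np.logical_and(p, r).sum() / union
    return scores

# Illustrative only: random masks stand in for a CNN classification and a
# manually interpreted reference map of the same image date.
rng = np.random.default_rng(0)
prediction = rng.integers(0, 2, (512, 512))
reference = rng.integers(0, 2, (512, 512))
print(localised_iou(prediction, reference, window=64))

Unlike a single overall accuracy or kappa value, the resulting grid of scores shows where in the image the two maps diverge, which is the property used here to compare smaller, spatially concentrated differences in agricultural mapping.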

Keywords

fully convolutional networks, convolutional neural networks, transfer learning, remote sensing, image classification, opium cultivation

Rights

© Cranfield University, 2021. All rights reserved. No part of this publication may be reproduced without the written permission of the copyright holder.
