Pose-informed deep learning method for SAR ATR

Date

2020-03-30

Publisher

The Institution of Engineering and Technology (IET)

Type

Article

ISSN

1751-8784

Citation

Belloni C, Aouf N, Balleri A, et al. (2020) Pose-informed deep learning method for SAR ATR. IET Radar, Sonar & Navigation, Volume 14, Issue 11, November 2020, pp. 1649-1658

Abstract

Synthetic aperture radar (SAR) images are attractive for automatic target recognition (ATR) because they can be acquired day and night and under a wide range of weather conditions. However, SAR images can be time-consuming to analyse, even for experts; ATR can alleviate this burden, and deep learning is an attractive solution. A new pose-informed deep learning architecture is proposed that accounts for the impact of target orientation on the SAR image as the scatterer configuration changes. Classification is achieved in two stages. First, the orientation of the target is estimated using a Hough transform and a convolutional neural network (CNN). Then, classification is performed by a CNN trained specifically on targets with orientations similar to that of the target under test. The networks are trained with translation and SAR-specific data augmentation. The proposed pose-informed deep network architecture was successfully tested on the Military Ground Target Dataset (MGTD) and the Moving and Stationary Target Acquisition and Recognition (MSTAR) datasets. Results show the proposed solution outperformed standard AlexNets on the MGTD, MSTAR extended operating condition (EOC)1, EOC2 and standard operating condition (SOC)10 datasets, with a score of 99.13% on MSTAR SOC10.
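The two-stage routing described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it uses only a plain Hough line transform (NumPy) to estimate a dominant orientation, and a simple binning function standing in for the paper's pose-specific CNNs; the function names, the number of pose bins, and the thresholded-image input are all assumptions for illustration.

```python
import numpy as np

def dominant_orientation(img, n_theta=180):
    """Estimate a dominant orientation angle (degrees) for a thresholded
    SAR image via a basic Hough line transform. Illustrative stand-in for
    the paper's Hough-plus-CNN pose-estimation stage (an assumption, not
    the published method)."""
    ys, xs = np.nonzero(img)                       # coordinates of bright pixels
    thetas = np.deg2rad(np.arange(n_theta))
    # Hough parameterisation: rho = x*cos(theta) + y*sin(theta)
    rhos = np.round(xs[:, None] * np.cos(thetas)
                    + ys[:, None] * np.sin(thetas)).astype(int)
    max_rho = int(np.ceil(np.hypot(*img.shape)))   # bound on |rho|
    acc = np.zeros((2 * max_rho + 1, n_theta), dtype=int)
    for t in range(n_theta):                       # accumulate votes per theta
        np.add.at(acc[:, t], rhos[:, t] + max_rho, 1)
    _, theta_idx = np.unravel_index(acc.argmax(), acc.shape)
    return theta_idx                               # angle of the line normal

def select_pose_bin(angle_deg, n_bins=8):
    """Map an estimated orientation to one of n_bins pose sectors, each of
    which would have its own orientation-specific classifier CNN."""
    return int(angle_deg % 180) * n_bins // 180
```

In use, `select_pose_bin` would index into a list of classifiers, each trained only on targets whose orientations fall in that sector, so the second-stage network sees a narrower range of scatterer configurations.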

Keywords

radar target recognition, radar imaging, learning (artificial intelligence), synthetic aperture radar, image recognition, Hough transforms, neural nets, image classification

Rights

Attribution-NonCommercial 4.0 International
