DeepLO: Multi-projection deep LIDAR odometry for space orbital robotics rendezvous relative navigation

Date

2020-07-30

Publisher

Elsevier

Type

Article

ISSN

0094-5765

Citation

Kechagias-Stamatis O, Aouf N, Dubanchet V, Richardson M. (2020) DeepLO: Multi-projection deep LIDAR odometry for space orbital robotics rendezvous relative navigation. Acta Astronautica, Volume 177, December 2020, pp. 270-285

Abstract

This work proposes a new Light Detection and Ranging (LIDAR) based navigation architecture that is appropriate for uncooperative relative robotic space navigation applications. In contrast to current solutions that operate directly on 3D LIDAR data, our architecture relies on a Deep Recurrent Convolutional Neural Network (DRCNN) that exploits multi-projected imagery of the acquired 3D LIDAR data. The advantages of the proposed DRCNN are an effective feature representation, facilitated by the Convolutional Neural Network module within the DRCNN; robust modeling of the navigation dynamics, due to the Recurrent Neural Network incorporated in the DRCNN; and a low processing time. Our trials evaluate several current state-of-the-art space navigation methods on various simulated but credible scenarios involving a satellite model developed by Thales Alenia Space (France). Additionally, we evaluate the methods on real satellite LIDAR data acquired in our laboratory. Results demonstrate that the proposed architecture, although trained solely on simulated data, is highly adaptable and compares favorably with current algorithms on both the simulated and the real LIDAR data scenarios, affording better odometry accuracy at lower computational requirements.
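
The abstract above describes the DRCNN architecture at a high level: a CNN extracts features from multi-projected LIDAR imagery, and an RNN models the temporal navigation dynamics before a relative pose is regressed. Purely as an illustration, the sketch below shows one minimal way such a pipeline could be wired up in PyTorch; the class name DRCNNOdometry, the layer sizes, the three-channel projection stacking, and the 6-DoF pose output are assumptions made for this sketch and do not reproduce the authors' implementation.

import torch
import torch.nn as nn

class DRCNNOdometry(nn.Module):
    """Illustrative DRCNN-style odometry sketch (assumed, not the paper's code):
    a CNN encodes each multi-projection LIDAR image, an LSTM models the
    temporal dynamics, and a linear head regresses a 6-DoF relative pose."""

    def __init__(self, in_channels=3, feat_dim=256, hidden_dim=128):
        super().__init__()
        # CNN module: compresses each projected LIDAR image into a feature vector
        self.cnn = nn.Sequential(
            nn.Conv2d(in_channels, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(128, feat_dim)
        # RNN module: captures the navigation dynamics across the frame sequence
        self.rnn = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.pose = nn.Linear(hidden_dim, 6)  # 3 translation + 3 rotation parameters

    def forward(self, x):
        # x: (batch, time, channels, height, width); channels stack the projections
        b, t, c, h, w = x.shape
        f = self.cnn(x.view(b * t, c, h, w)).flatten(1)
        f = self.fc(f).view(b, t, -1)
        out, _ = self.rnn(f)
        return self.pose(out)

# Example usage: 2 sequences of 5 frames, 3 projection channels, 64x64 images
poses = DRCNNOdometry()(torch.randn(2, 5, 3, 64, 64))
print(poses.shape)  # torch.Size([2, 5, 6])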

Keywords

Convolutional neural networks, Deep learning, LIDAR, Multi-dimensional processing, Recurrent neural networks, Relative navigation, Robotics

Rights

Attribution-NonCommercial-NoDerivatives 4.0 International
