A comparative analysis of hybrid sensor fusion schemes for visual–inertial navigation
Abstract
Visual-Inertial Odometry (VIO) has been extensively studied for navigation in GNSS-denied environments, but its performance can be heavily degraded by the complexity of the navigation environment, including weather conditions, illumination variation, flight dynamics, and environmental structure. Hybrid fusion approaches that integrate Neural Networks (NN), especially Gated Recurrent Units (GRU), with Kalman filters (KF) such as the Error-State Kalman Filter (ESKF) have shown promising results in mitigating the system nonlinearities caused by challenging environmental conditions and data issues; however, there is a lack of systematic studies quantitatively analysing and comparing their performance differences. To address this gap and enable robust navigation in complex conditions, this study proposes and systematically analyses the performance of three hybrid fusion schemes for VIO-based navigation of Unmanned Aerial Vehicles (UAV). The three hybrid VIO schemes comprise Visual Odometry (VO) error compensation using an NN, KF error compensation using an NN, and prediction of the Kalman gain using an NN. The comparative analysis is performed using data generated in MATLAB with the Unreal Engine, covering diverse challenging environmental conditions: fog, rain, illumination variability, and variability in the number of features available for extraction during UAV flight in an urban environment. The results demonstrate the performance improvement achieved by the hybrid VIO fusion schemes over traditional ESKF-based fusion in the presence of multiple visual failure modes. The comparative analysis reveals the most notable improvement is achieved by the first scheme (NN-based VO error compensation), with enhancements of 93% in sunny, 91% in foggy, and 90% in rainy conditions compared with the other two hybrid VIO architectures.
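To illustrate where the three hybrid schemes named in the abstract differ, the following is a minimal, hypothetical sketch (not the thesis implementation): a toy scalar Kalman update with three optional hooks for a learned correction. The trained GRU is stubbed out with a placeholder function, and all names, modes, and numbers are illustrative assumptions only.

```python
# Illustrative sketch only: a scalar Kalman update showing the three points
# where a learned correction (e.g. a trained GRU) could be injected, mirroring
# the three hybrid schemes listed in the abstract. The GRU is a stub here;
# the actual thesis architecture and parameters are not reproduced.
import numpy as np

def gru_stub(history):
    """Placeholder for a trained GRU; a real model would map its input
    sequence to a learned correction or gain term (assumption)."""
    return 0.0

def kf_update(x, P, z, R, H=1.0, mode="vo_error"):
    """One scalar Kalman update with an optional NN hook.

    mode = "vo_error" : scheme 1, NN compensates the VO measurement error
    mode = "kf_error" : scheme 2, NN compensates the filter output error
    mode = "nn_gain"  : scheme 3, NN predicts the Kalman gain directly
    """
    if mode == "vo_error":
        z = z - gru_stub([z])              # correct the visual measurement first
    S = H * P * H + R                      # innovation covariance
    K = gru_stub([P, R]) if mode == "nn_gain" else P * H / S
    x_new = x + K * (z - H * x)            # state update
    P_new = (1.0 - K * H) * P              # covariance update
    if mode == "kf_error":
        x_new = x_new - gru_stub([x_new])  # correct the filter estimate afterwards
    return x_new, P_new

# Toy usage: fuse noisy VO-like position measurements into a scalar state.
x, P = 0.0, 1.0
for z in np.array([0.9, 1.1, 1.0]):
    x, P = kf_update(x, P, z, R=0.5, mode="vo_error")
print(x, P)
```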