Uncertainty-based sensor fusion architecture using Bayesian-LSTM neural network

dc.contributor.author: Geragersian, Patrick
dc.contributor.author: Petrunin, Ivan
dc.contributor.author: Guo, Weisi
dc.contributor.author: Grech, Raphael
dc.date.accessioned: 2023-02-07T11:16:56Z
dc.date.available: 2023-02-07T11:16:56Z
dc.date.issued: 2023-01-19
dc.description.abstract: Uncertainty-based sensor management for positioning is an essential component of safe drone operations in urban environments with large urban valleys. These canyons significantly restrict the Line-Of-Sight signal conditions required for accurate positioning using Global Navigation Satellite Systems (GNSS). Therefore, sensor fusion solutions need to be in place which can take advantage of alternative Positioning, Navigation and Timing (PNT) sensors, such as accelerometers or gyroscopes, to complement GNSS information. Recent state-of-the-art research has focused on Machine Learning (ML) techniques, such as Support Vector Machines (SVM), that utilize statistical learning to provide an output for a given input. However, understanding the uncertainty of the predictions made by Deep Learning (DL) models can help improve the integrity of fusion systems. Therefore, there is a need for a DL model that can also provide uncertainty-related information as part of its output. This paper proposes a Bayesian-LSTM Neural Network (BLSTMNN) that is used to fuse GNSS and Inertial Measurement Unit (IMU) data. Furthermore, the Protection Level (PL) is estimated based on the uncertainty distribution given by the system. To test the algorithm, Hardware-In-the-Loop (HIL) simulation has been performed, utilizing Spirent's GSS7000 simulator and OKTAL-SE Sim3D to simulate GNSS propagation in urban canyons. SimSENSOR is used to simulate the accelerometer and gyroscope. Results show that the Bayesian-LSTM provides the best fusion performance compared to GNSS alone and to GNSS/IMU fusion using EKF and SVM. Furthermore, regarding uncertainty estimates, the proposed algorithm can estimate the positioning boundaries correctly, with an error rate of 0.4% and an accuracy of 99.6%.
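The record does not reproduce the paper's implementation. As an illustration only, the sketch below shows one common way to realise a Bayesian LSTM that outputs both a position estimate and its uncertainty, namely Monte Carlo dropout in PyTorch, together with a simple mapping from the predictive standard deviation to a protection level. All names, dimensions, and constants here (BayesianLSTM, mc_predict, the 9-feature GNSS/IMU window, k = 3.29) are assumptions made for the example and are not taken from the paper.

```python
import torch
import torch.nn as nn

class BayesianLSTM(nn.Module):
    """LSTM kept dropout-active at inference (Monte Carlo dropout),
    so repeated forward passes yield a predictive distribution.
    (Illustrative stand-in; the paper's BLSTMNN architecture may differ.)"""
    def __init__(self, input_dim=9, hidden_dim=64, output_dim=3, p_drop=0.2):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, num_layers=2,
                            batch_first=True, dropout=p_drop)
        self.drop = nn.Dropout(p_drop)
        self.head = nn.Linear(hidden_dim, output_dim)  # 3D position estimate

    def forward(self, x):
        out, _ = self.lstm(x)                # x: (batch, time, features)
        return self.head(self.drop(out[:, -1, :]))

@torch.no_grad()
def mc_predict(model, x, n_samples=100):
    """Run the network n_samples times with dropout enabled to obtain a
    mean position and a per-axis standard deviation (the uncertainty)."""
    model.train()                            # keep dropout layers active
    samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)

def protection_level(sigma, k=3.29):
    """Map predictive standard deviation to a protection level; k = 3.29 is
    an illustrative Gaussian quantile, not the bound used in the paper."""
    return k * sigma

# Hypothetical usage: a 2 s window of GNSS + IMU features sampled at 10 Hz
model = BayesianLSTM()
window = torch.randn(1, 20, 9)               # (batch, time steps, features)
mean_pos, sigma = mc_predict(model, window)
pl = protection_level(sigma)
```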
dc.identifier.citation: Geragersian P, Petrunin I, Guo W, Grech R. (2023) Uncertainty-based sensor fusion architecture using Bayesian-LSTM neural network. In: AIAA SciTech Forum 2023, 23-27 January 2023, National Harbor, Maryland, USA. Paper number AIAA 2023-0193
dc.identifier.uri: https://doi.org/10.2514/6.2023-0193
dc.identifier.uri: https://dspace.lib.cranfield.ac.uk/handle/1826/19137
dc.language.iso: en
dc.publisher: AIAA
dc.rights: Attribution-NonCommercial 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by-nc/4.0/
dc.title: Uncertainty-based sensor fusion architecture using Bayesian-LSTM neural network
dc.type: Conference paper

Files

Original bundle
Name: Uncertainty-based_sensor_fusion_architecture-2023.pdf
Size: 1.46 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 1.63 KB
Format: Item-specific license agreed upon to submission