Enabling UAVs night-time navigation through mutual information-based matching of event-generated images
Abstract
Advanced Air Mobility is expected to revolutionize general transportation. Making it a reality, however, raises significant challenges, demanding technologies that ensure the attributes expected in these scenarios: resilience, robustness, a large operational range, high accuracy, low-SWaP equipment, and real-time processing. Although existing visual-based navigation solutions for aerial applications provide outstanding results under nominal conditions, their performance is highly constrained by lighting conditions, making them infeasible for real operations. To address this limitation and extend the current operational range to extremely low-illuminated environments, this paper presents a solution that leverages one of the most powerful properties of event cameras: their high dynamic range. Data provided by an event camera (also called a dynamic vision sensor) are used to estimate the relative displacement of a flying vehicle under night-time conditions. To that end, two threads running in parallel have been developed: a reference map generator, operating at low frequency, which reconstructs a 2-D map of the environment, and a localization thread, which matches real-time event-generated images against the reference map at high frequency, applying Mutual Information to estimate the aircraft's relative displacement.
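The Mutual-Information matching step described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the function names, the histogram-based MI estimate, and the exhaustive integer-shift search over a small window are all illustrative assumptions.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """MI between two equally sized grayscale images via a joint histogram."""
    hist_2d, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = hist_2d / hist_2d.sum()          # joint intensity distribution
    px = pxy.sum(axis=1)                   # marginal of img_a
    py = pxy.sum(axis=0)                   # marginal of img_b
    px_py = px[:, None] * py[None, :]      # product of marginals
    nz = pxy > 0                           # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / px_py[nz])))

def estimate_shift(reference, frame, max_shift=5):
    """Find the integer (dy, dx) displacement of `frame` inside a larger
    `reference` map by exhaustively maximizing Mutual Information
    (a toy stand-in for matching event-generated images to the map)."""
    best_mi, best_shift = -np.inf, (0, 0)
    h, w = frame.shape
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            ref_crop = reference[max_shift + dy : max_shift + dy + h,
                                 max_shift + dx : max_shift + dx + w]
            mi = mutual_information(ref_crop, frame)
            if mi > best_mi:
                best_mi, best_shift = mi, (dy, dx)
    return best_shift
```

In a real pipeline the low-frequency thread would maintain `reference` as the reconstructed 2-D map, while the high-frequency thread would call an MI-based matcher like `estimate_shift` on each incoming event-generated image; the brute-force search here would be replaced by a more efficient optimization.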