Autonomous navigation with taxiway crossings identification using camera vision and airport map
Abstract
With the increasing demand for unmanned aerial vehicle (UAV) operations envisioned for the future of aviation, the number of pilots will be far smaller than the number of drones, necessitating an increased level of autonomy in drones to alleviate workload. Autonomous UAV taxiing allows an aircraft to move on the ground without human intervention, specifically from the gate to the runway and vice versa. This study presents a lightweight vision-based autonomous taxiway navigation system that fuses the feed from a camera mounted under the nose with airport map data to provide guidance and navigation. A sliding-window mechanism is applied during centreline identification to detect line divergence. Centreline representations, including divergence, direction, and heading, are cross-referenced with the airport database to localise the aircraft and generate navigation solutions. A simple proportional-integral-derivative (PID) controller is developed over aircraft dynamic models aligned with Eagle Dynamics' Digital Combat Simulator to demonstrate the centreline-following function. Overall system performance is assessed through simulations encompassing individual functionality tests: centreline extraction, line matching, line-to-follow, generalisation capability, and computational complexity. The evaluations indicate the promising potential of camera vision for enabling autonomous UAV taxiing, with a 71% success rate in detecting the correct line to follow and the remaining 29% classified as background. The proposed system also shows a high generalisation capability, with a success rate above 67% when tested on other paths. The source code is open-sourced at https://github.com/DelQuentin/TaxiEye.
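The sliding-window centreline scan can be sketched as follows. This is a minimal illustration of the general technique, not the paper's implementation: the window count, pixel threshold, and `split_gap` parameter are illustrative assumptions, and the input is a pre-thresholded binary mask of the painted centreline.

```python
import numpy as np

def sliding_window_centreline(mask, n_windows=10, min_pixels=5, split_gap=20):
    """Scan a binary centreline mask bottom-to-top with horizontal windows.

    For each window, the column indices of line pixels are collected; a
    window whose occupied columns contain a gap wider than `split_gap`
    marks a divergence (e.g. a branching taxiway). Returns the mean line
    column per window (None if too few pixels) and the index of the first
    diverging window (None if the line never splits).
    """
    h = mask.shape[0]
    win_h = h // n_windows
    centres, divergence = [], None
    for i in range(n_windows):
        top = h - (i + 1) * win_h          # windows advance from the bottom edge up
        _, xs = np.nonzero(mask[top:top + win_h])
        if len(xs) < min_pixels:
            centres.append(None)           # too little paint in this window
            continue
        cols = np.unique(xs)
        if divergence is None and len(cols) > 1 and np.diff(cols).max() > split_gap:
            divergence = i                 # the line has split into two branches here
        centres.append(float(cols.mean()))
    return centres, divergence
```

On a synthetic mask where a single vertical line forks into two branches halfway up the frame, the function reports the window index at which the fork first appears, which is the cue the abstract describes for cross-referencing against the airport map.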
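A generic discrete-time PID loop of the kind the abstract mentions can be sketched as below. The gains, timestep, and first-order "plant" responding to the steering command are illustrative assumptions for demonstration only; they are not the paper's aircraft dynamic model or tuned values.

```python
class PID:
    """Minimal discrete-time PID controller; gains and timestep are illustrative."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        """Return the control command for the current error sample."""
        self.integral += error * self.dt                  # accumulated error
        derivative = (error - self.prev_error) / self.dt  # error rate
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Hypothetical closed loop: drive the aircraft's lateral offset from the
# detected centreline towards zero through a toy first-order plant.
pid = PID(kp=0.5, ki=0.0, kd=0.1, dt=0.1)
offset = 10.0                      # metres off the centreline
for _ in range(300):
    steer = pid.update(offset)     # command opposing the offset
    offset -= steer * pid.dt       # toy plant response, not an aircraft model
```

In the paper's setting the error would come from the vision pipeline's centreline estimate and the command would feed the simulated aircraft's steering, but the controller structure is the same.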