Energy consumption optimisation for unmanned aerial vehicle based on reinforcement learning framework

Date published

2024-04-16

Publisher

Inderscience

Type

Article

ISSN

1742-4267

Citation

Wang Z, Xing Y. (2024) Energy consumption optimisation for unmanned aerial vehicle based on reinforcement learning framework. International Journal of Powertrains, 13(1), March 2024, pp. 75-94

Abstract

The average battery life of drones in use today is around 30 minutes, which significantly limits long-range operations such as seamless delivery and security monitoring. Meanwhile, the transportation sector is responsible for 93% of all carbon emissions, making it crucial to control the energy usage of UAVs in operation if future large-scale air traffic is to reach net zero. In this study, a reinforcement learning (RL)-based model was implemented to optimise the energy consumption of drones. The RL-based energy optimisation framework dynamically tunes the vehicle control system to maximise energy economy while accounting for mission objectives, ambient conditions, and system performance. RL was used to create a dynamically optimised vehicle control system that selects the most energy-efficient route. Depending on the training time, the trained UAV in this study consumed between 50.1% and 91.6% less energy than an untrained UAV on the same map.
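
As a rough illustration of the approach described above, the sketch below trains a tabular Q-learning agent to find an energy-efficient route on a small grid map, where the energy spent entering each cell is treated as a negative reward. This is an illustrative sketch only, not the authors' implementation: the 5 x 5 map, the randomly assigned per-cell energy costs, the goal bonus, and the hyperparameters (alpha, gamma, epsilon, episode count) are all assumptions made for this example.

import numpy as np

rng = np.random.default_rng(seed=0)

GRID = 5                                             # assumed 5 x 5 grid map
ENERGY = rng.uniform(1.0, 5.0, size=(GRID, GRID))    # assumed per-cell energy cost
START, GOAL = (0, 0), (GRID - 1, GRID - 1)
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]         # up, down, left, right

Q = np.zeros((GRID, GRID, len(ACTIONS)))             # tabular Q-values
alpha, gamma, eps = 0.1, 0.95, 0.2                   # assumed learning hyperparameters


def step(state, a_idx):
    """Move one cell; the energy spent entering the next cell is a negative reward."""
    dr, dc = ACTIONS[a_idx]
    r, c = state
    nr = min(max(r + dr, 0), GRID - 1)
    nc = min(max(c + dc, 0), GRID - 1)
    nxt = (nr, nc)
    reward = -ENERGY[nr, nc] + (100.0 if nxt == GOAL else 0.0)
    return nxt, reward, nxt == GOAL


for episode in range(2000):                          # number of training episodes (assumed)
    state = START
    for _ in range(200):                             # cap steps per episode
        if rng.random() < eps:                       # epsilon-greedy exploration
            a = int(rng.integers(len(ACTIONS)))
        else:
            a = int(np.argmax(Q[state]))
        nxt, reward, done = step(state, a)
        # Standard Q-learning update.
        Q[state][a] += alpha * (reward + gamma * np.max(Q[nxt]) - Q[state][a])
        state = nxt
        if done:
            break

# Greedy rollout of the learned policy: the route it follows approximates the
# lowest-energy path from START to GOAL on this particular map.
state, path, energy_used = START, [START], 0.0
while state != GOAL and len(path) < GRID * GRID:
    state, _, _ = step(state, int(np.argmax(Q[state])))
    energy_used += ENERGY[state]
    path.append(state)

print("route:", path)
print("energy used:", round(float(energy_used), 2))

Because every move is penalised by the energy it consumes, longer training tends to produce a cheaper greedy route, which is consistent with the trained-versus-untrained comparison reported in the abstract.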

Keywords

Power consumption, machine learning, reinforcement learning, trajectory optimisation, Q-learning, energy efficiency, path planning

Rights

Attribution-NonCommercial 4.0 International
