Browsing by Author "Harman, Stephen"
Now showing 1 - 6 of 6
Item Open Access
Advanced cognitive networked radar surveillance (IEEE, 2021-06-18)
Jahangir, Mohammed; Baker, Chris J.; Antoniou, Michail; Griffin, Benjamin; Balleri, Alessio; Money, David; Harman, Stephen
The concept of a traditional monostatic radar with co-located transmit and receive antennas naturally imposes performance limits that can adversely impact applications. Using a multiplicity of transmit and receive antennas and exploiting spatial diversity provides additional degrees of design freedom that can help overcome such limitations. Further, when coupled with cognitive signal processing, such advanced systems offer significant improvements in performance over their monostatic counterparts and will likely lead to new applications for radar sensing. In this paper, we explore the fundamentals of multistatic network radar, highlighting both its potential and its constraints whilst identifying future research needs and applications. Initial experimental results are presented for a 2-node networked staring radar.

Item Open Access
Developing drone experimentation facility: progress, challenges and cUAS consideration (IEEE, 2021-07-02)
Panagiotakopoulos, Dimitrios; Williamson, Alex; Petrunin, Ivan; Harman, Stephen; Quilter, Tim; Williams-Wynn, Ian; Goudie, Gavin; Watson, Neil; Vernall, Phil; Reid, Jonathan; Puscius, Eimantas; Cole, Adrian; Tsourdos, Antonios
The operation of Unmanned Aerial Systems (UAS) is widely recognised to be limited globally by the challenges associated with gaining regulatory approval for flight Beyond Visual Line of Sight (BVLOS) of the UAS Remote Pilot. This challenge stems from unmanned aircraft flights having to follow the same ‘see and avoid’ regulatory principles with respect to collision avoidance as manned aircraft.
Due to the technical challenge of keeping UAS and Remote Pilots adequately informed of potential traffic threats, this requirement effectively prohibits BVLOS UAS flight in uncontrolled airspace unless a specific UAS operational airspace is segregated from manned aviation traffic, often achieved by use of a Temporary Danger Area (TDA) or other spatial arrangements. The UK Civil Aviation Authority (CAA) has defined a Detect and Avoid (DAA) framework for UAS operators to follow in order to demonstrate effective collision avoidance capability, and hence the ability to satisfy the ‘see and avoid’ requirement. The National BVLOS Experimentation Corridor (NBEC) is an initiative to create a drone experimentation facility that incorporates a range of surveillance and navigation information sources, including radars, data fusion, and operational procedures, in order to demonstrate a capable DAA system. The NBEC is partly located within an active Aerodrome Traffic Zone (ATZ) at Cranfield Airport, which further creates the opportunity to develop and test systems and procedures together with an operational Air Traffic Control (ATC) unit. This allows manned and unmanned traffic to be integrated, from both systems and procedural perspectives, inside segregated airspace in a first stage, and subsequently when transiting to/from non-segregated airspace. The NBEC provides the environment in which a number of these challenges can be addressed. This paper discusses the current lack of target performance parameters, the methodology for gaining regulatory approval for non-segregated BVLOS flights, and the definition of performance parameters for counter-UAS (cUAS) systems.

Item Open Access
Development of a passive dual channel receiver at L-band for the detection of drones (IEEE, 2022-06-02)
Griffin, Benjamin; Balleri, Alessio; Baker, Chris; Jahangir, Mohammed; Harman, Stephen
Staring radars use a static wide-beam transmitting antenna and a directive digital array to form multiple simultaneous beams on receive.
Because the beams are fixed, the radar can employ long integration times to detect slow, low-RCS targets, such as drones, which present a challenge to traditional air surveillance radars. The use of multiple spatially separated receivers cooperating with the staring transmitter in a multistatic network allows multi-perspective target acquisitions that can help mitigate interference, enhance the detection of drones, and reduce estimation errors. Here, the development and experimental results of a passive, dual-channel, L-band receiver are presented. The receiver has been used to take measurements of both moving vehicles and drones in flight using a bistatic staring transmitter. An analysis is presented in which GPS data are used to quantify the estimation performance of the receiver.

Item Open Access
Radar discrimination of small airborne targets through kinematic features and machine learning (IEEE, 2022-10-31)
Doumard, Timothée; Gañán Riesco, Fabio; Petrunin, Ivan; Panagiotakopoulos, Dimitrios; Bennett, Cameron; Harman, Stephen
This work studies a binary classification problem for small airborne targets (drone vs. other) by means of trajectory analysis. For this purpose, a classification scheme is proposed that applies Random Forests to a set of kinematic features extracted from drone trajectories obtained from radar detections. The development is based on experimental data acquired from the Holographic radar from Aveillant Ltd. An approach for real-time classification is proposed, where an adaptive sliding window procedure is employed to make predictions over time from trajectories. Several models utilising different kinematic features (angle, slope, velocity, and their combination) are studied. The best model achieves an accuracy of more than 95%.
In addition, fundamental issues with imbalanced datasets in the context of this topic are raised and illustrated using the collected data.

Item Open Access
Realistic simulation of drone micro-Doppler signatures (IEEE, 2022-06-02)
Bennett, Cameron; Harman, Stephen; Petrunin, Ivan
This paper presents a novel approach to simulating the micro-Doppler signatures caused by drones. The focus of this work is to produce realistic signatures that represent the variation observed in live radar measurements. To accomplish this, the kinematics and dynamics of a drone flight are modelled to capture the changing rotor rotation rates. The simulation results show realistic variation that is representative of measured drone flights.

Item Open Access
Towards fully autonomous drone tracking by a reinforcement learning agent controlling a pan–tilt–zoom camera (MDPI, 2024-05-30)
Wisniewski, Mariusz; Rana, Zeeshan A.; Petrunin, Ivan; Holt, Alan; Harman, Stephen
Pan–tilt–zoom cameras are commonly used for surveillance applications. Their automation could reduce the workload of human operators and increase the safety of airports by tracking anomalous objects such as drones. Reinforcement learning is an artificial intelligence method that outperforms humans on certain specific tasks; however, there is a lack of data and benchmarks for pan–tilt–zoom control mechanisms in tracking airborne objects. Here, we present a simulated environment containing a pan–tilt–zoom camera that is used to train and evaluate a reinforcement learning agent. We found that the agent can learn to track the drone in our basic tracking scenario, outperforming a solved-scenario benchmark value. The agent is also tested on more complex scenarios, where the drone is occluded behind obstacles. While the agent does not quantitatively outperform the optimal human model, it shows qualitative signs of learning to solve the complex, occluded, non-linear trajectory scenario.
Given further training, investigation, and different algorithms, we believe a reinforcement learning agent could be used to solve such scenarios consistently. Our results demonstrate how complex drone surveillance tracking scenarios may be solved and fully automated by reinforcement learning agents. We hope our environment becomes a starting point for more sophisticated autonomy in the control of pan–tilt–zoom cameras tracking drones and surveilling airspace for anomalous objects. For example, distributed, multi-agent systems of pan–tilt–zoom cameras combined with other sensors could lead towards fully autonomous surveillance, challenging experienced human operators.
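The pan–tilt tracking problem described in the last abstract can be illustrated with a minimal toy sketch. This is not the authors' simulator or benchmark: the environment name (`PanTiltEnv`), the circular drone trajectory, the step sizes, field of view, and reward are all illustrative assumptions. The greedy controller stands in for a baseline policy; in the paper's setting, a trained reinforcement learning agent would take its place.

```python
import math

class PanTiltEnv:
    """Toy pan-tilt tracking environment (illustrative only, not the
    paper's simulator). A 'drone' moves on an assumed elliptical path in
    angle space; the agent steps the pan/tilt angles to keep it in view."""

    def __init__(self, step_deg=2.0, fov_deg=10.0, horizon=200):
        self.step = step_deg      # pan/tilt increment per action, degrees
        self.fov = fov_deg        # half-width of the field of view, degrees
        self.horizon = horizon    # episode length in frames
        self.reset()

    def reset(self):
        self.t = 0
        self.pan, self.tilt = 0.0, 0.0
        return self._obs()

    def _drone(self):
        # Hypothetical drone trajectory expressed directly in (pan, tilt).
        return 30.0 * math.cos(0.05 * self.t), 10.0 * math.sin(0.05 * self.t)

    def _obs(self):
        # Observation: angular offset of the drone from the camera boresight.
        dx, dy = self._drone()
        return dx - self.pan, dy - self.tilt

    def step_env(self, action):
        # action: (dpan, dtilt), each in {-1, 0, +1} camera steps.
        self.pan += self.step * action[0]
        self.tilt += self.step * action[1]
        self.t += 1
        ex, ey = self._obs()
        in_view = abs(ex) < self.fov and abs(ey) < self.fov
        reward = 1.0 if in_view else 0.0   # reward each frame the drone is in frame
        return (ex, ey), reward, self.t >= self.horizon

def greedy_policy(obs):
    """Baseline stand-in for a trained agent: step towards the target."""
    sign = lambda v: (v > 0) - (v < 0)
    return sign(obs[0]), sign(obs[1])

env = PanTiltEnv()
obs, total, done = env.reset(), 0.0, False
while not done:
    obs, reward, done = env.step_env(greedy_policy(obs))
    total += reward
print(f"frames in view: {total:.0f} / {env.horizon}")
```

An RL agent would replace `greedy_policy` with a learned mapping from observations to actions, and occlusion scenarios could be sketched by zeroing the observation over some interval so the policy must predict the drone's motion rather than react to it.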