Periodic Reporting for period 2 - HawkEye (Vision-based Guidance and Control in Birds, with Applications to Autonomous Unmanned Aircraft)

Teaser

Birds have been called “une aile guidée par un oeil” (“a wing guided by an eye”), and whilst they possess many senses besides vision, their wings are indeed guided largely by their eyes. Nonetheless, we know surprisingly little of how birds use vision to guide their flight, and even less of their...

Summary

Birds have been called “une aile guidée par un oeil” (“a wing guided by an eye”), and whilst they possess many senses besides vision, their wings are indeed guided largely by their eyes. Nonetheless, we know surprisingly little of how birds use vision to guide their flight, and even less of their underlying guidance and control laws. This is unfortunate given the importance that vision is poised to assume in autonomous air vehicles, or drones. With good reason, the law still requires a human eye to remain in the loop, but as with the coming revolution in driverless cars, the future of unmanned flight lies in autonomy. Recall the £50M disruption caused at London's Gatwick airport in December 2018, when reports of a drone closed the runway for 33 hours and no technology was available to spot the target and remove it safely from the airspace. Now picture a hawk, spotting its prey at a distance of over 1 km, homing in on it visually whilst avoiding obstacles and clutter, and making the kill, and consider what engineering has to learn from biology.

This project therefore has two overarching ambitions: 1) to revolutionize our understanding of vision-based guidance and control in birds; and 2) to carry these insights over to application in unmanned autonomous aircraft, including small interceptors designed to take out rogue drones in protected airspace. This presents a formidable technical challenge, but by combining a state-of-the-art motion capture suite with targets/obstacles moving under motion control in a custom-built flight facility, this project will identify the feedback laws and mechanisms underpinning perching, pursuit, obstacle avoidance, and gap negotiation in birds, together with the movements of their eyes, head, wings, and tail that they use to achieve these tasks. More than this, the project will identify the precise motion cues to which birds attend, settling longstanding questions on the extent to which guidance emerges from simple algorithmic rules versus state feedback and estimation, with wider implications for our understanding of avian perception.

The detailed insights that we gain from our experimental research with birds in the laboratory will be validated against computer simulations, and against measurements of their 3D flight trajectories and head movements in the field. These field studies will include chases of dummy targets or manoeuvring drones in carefully-controlled experiments, and observational studies of birds chasing prey in the wild. The most commercially and technologically significant of the insights that we uncover will then be applied in the small drones that we will also be flying in the same flight facility. This work breaks new ground in all directions, drawing the study of biology, physics, engineering, and computer science together under one roof, in a unique, world-leading project delivering a combination of fundamental science and applied technologies.

Work performed

Year 1 of the project was spent designing, building, and commissioning the new 220 m² flight facility, which combines open-fronted aviaries providing the highest standards of animal husbandry with two indoor flight rooms equipped with a state-of-the-art motion capture system. We also completed the design and development of the suite of robotic targets/perches/gaps/obstacles that we use to drive goal-oriented behaviours in our birds, and acquired a flying team of 5 Harris' Hawks and a breeding colony of Zebra Finches for the flight testing. Years 2-3 of the project have been spent acquiring flight test data from the birds in a range of different experimental setups studying visually-guided target pursuit, perching, obstacle avoidance, and gap negotiation behaviours. To date, we have collected detailed motion capture data from >15,000 flights, including every flight ever undertaken by 3 fledgling hawks in their first year. These motion capture data provide measurements of the instantaneous position of small, lightweight markers, made of the same material as the reflective strips on a bike vest, which the birds wear on their feathers. The system measures the position of these markers with sub-millimetre accuracy, up to 400 times per second, whilst the birds fly through a large, 20 m long flight volume. Analysis of these data is well underway, and the results will be published shortly.
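To give a flavour of the data processing that these recordings require, the sketch below shows one standard way of turning raw marker positions sampled at 400 Hz into smoothed position and velocity estimates: a zero-phase low-pass filter followed by numerical differentiation. This is an illustrative example only, not our actual pipeline, and the cut-off frequency, filter order, and noise level are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 400.0        # sampling rate of the motion capture system, Hz
CUTOFF_HZ = 50.0  # assumed low-pass cut-off for smoothing marker positions

def smooth_and_differentiate(positions_xyz: np.ndarray):
    """Low-pass filter raw 3D marker positions and estimate velocity.

    positions_xyz : array of shape (n_samples, 3), in metres.
    Returns (smoothed positions, velocities) with the same shape.
    """
    # Zero-phase (forward-backward) filtering avoids the time lag that a
    # causal filter would introduce into the reconstructed trajectory.
    b, a = butter(N=4, Wn=CUTOFF_HZ, btype="low", fs=FS)
    smoothed = filtfilt(b, a, positions_xyz, axis=0)

    # Central differences give velocity in metres per second.
    velocity = np.gradient(smoothed, 1.0 / FS, axis=0)
    return smoothed, velocity

# Demonstration with synthetic data standing in for a real marker trajectory.
if __name__ == "__main__":
    t = np.arange(0.0, 2.0, 1.0 / FS)
    demo = np.column_stack([5 * t, np.sin(2 * np.pi * t), 0.1 * t ** 2])
    demo += 0.0005 * np.random.randn(*demo.shape)  # ~0.5 mm measurement noise
    pos, vel = smooth_and_differentiate(demo)
    print(pos.shape, vel.shape)
```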

We have also spent Years 1-3 of the project undertaking several experimental and observational studies in the field. These include tests with Peregrine Falcons and our regular flying team of Harris' Hawks wearing small MEMS sensors, like those in a mobile phone, to measure their head and body movements whilst chasing manoeuvring artificial targets. The results, published in the journal PNAS and reported on worldwide, show that peregrines have evolved to use the same guidance law as most guided missiles, known as proportional navigation. Our studies with hawks show that they have evolved to use a slightly different guidance law, better adapted to their natural habit of pursuing prey at close range through clutter; these results will be published shortly. These field experiments have been complemented by three observational studies of attacks by hawks and falcons on wild prey, in which we used stereo video techniques to reconstruct the 3D trajectories of the attacks and escapes that we observed. At the same time, we have developed a new physics-based flight simulator for birds that allows us to model the optimal attack and escape strategies of birds in a virtual environment, almost like a video game played by the computer against itself. The results of this work, published in PLoS Computational Biology and the Journal of Avian Biology, show that stooping from altitude increases catch success primarily by increasing the aerodynamic forces available for manoeuvring. Remarkably, the simulations also confirm that catch success is maximised when the guidance law is tuned in the same way as we had found experimentally in our peregrines.
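For readers unfamiliar with it, proportional navigation steers the attacker by commanding a turn rate proportional to the rate at which the line of sight to the target rotates. The sketch below is a minimal two-dimensional illustration of this principle; the navigation gain, flight speeds, and capture radius are arbitrary assumptions and do not correspond to the parameters fitted to our birds.

```python
import numpy as np

# Illustrative parameters only; not values fitted to real birds.
N_GAIN = 3.0           # navigation gain (dimensionless)
DT = 0.01              # time step, s
PURSUER_SPEED = 20.0   # m/s
TARGET_SPEED = 8.0     # m/s

def simulate_proportional_navigation(p0, heading0, target0, target_heading, steps=2000):
    """2D pursuit under proportional navigation: turn rate = N * line-of-sight rate."""
    p, heading = np.array(p0, dtype=float), float(heading0)
    q = np.array(target0, dtype=float)
    dq = TARGET_SPEED * DT * np.array([np.cos(target_heading), np.sin(target_heading)])
    los_prev = np.arctan2(*(q - p)[::-1])              # initial line-of-sight angle
    for _ in range(steps):
        q += dq                                        # target flies straight
        los = np.arctan2(*(q - p)[::-1])
        los_rate = np.angle(np.exp(1j * (los - los_prev))) / DT  # wrapped angle difference
        heading += N_GAIN * los_rate * DT              # proportional navigation command
        p += PURSUER_SPEED * DT * np.array([np.cos(heading), np.sin(heading)])
        los_prev = los
        if np.linalg.norm(q - p) < 0.5:                # notional capture radius, m
            return True, p
    return False, p

caught, where = simulate_proportional_navigation(
    p0=(0.0, 0.0), heading0=0.0, target0=(100.0, 30.0), target_heading=2.0)
print("captured" if caught else "escaped", where)
```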

We are now in the process of applying some of the insights that we have obtained on target pursuit to drones flown indoors under the gaze of the motion capture system, which will be a primary focus of the remaining work in Years 3-5.
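As a rough indication of how a bird-inspired pursuit law can be closed around external position feedback, the sketch below assumes a hypothetical interface in which the motion capture system supplies the drone and target positions and a velocity setpoint is sent to the drone's flight controller; the function names, gain, and speed limit are placeholders rather than the project's actual software.

```python
import numpy as np

# Placeholder interfaces: in a real system these would wrap the motion capture
# SDK and the drone's telemetry/command link. Here they return canned data so
# that the loop below can be run as a stand-alone demonstration.
def get_mocap_position(body_name: str) -> np.ndarray:
    """Return the latest 3D position (metres) of a tracked rigid body."""
    demo = {"drone": np.array([0.0, 0.0, 1.5]), "target": np.array([4.0, 2.0, 1.5])}
    return demo[body_name]

def send_velocity_command(vel_xyz: np.ndarray) -> None:
    """Send a velocity setpoint (m/s) to the drone's flight controller."""
    print("velocity command:", np.round(vel_xyz, 2))

MAX_SPEED = 2.0   # m/s, conservative indoor speed limit (assumed)
GAIN = 1.5        # proportional gain on the line-of-sight vector (assumed)

def pursuit_step(drone_name: str = "drone", target_name: str = "target") -> None:
    """One iteration of a simple pursuit loop: command flight down the line of sight."""
    drone = get_mocap_position(drone_name)
    target = get_mocap_position(target_name)
    los = target - drone                 # line-of-sight (range) vector
    cmd = GAIN * los                     # proportional pursuit command
    speed = np.linalg.norm(cmd)
    if speed > MAX_SPEED:                # saturate for indoor safety
        cmd = cmd * (MAX_SPEED / speed)
    send_velocity_command(cmd)

pursuit_step()   # in practice this would run at the motion capture frame rate
```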

Final results

The sheer volume of data that we have collected - over fifteen thousand flights to date, and counting - is unprecedented in behavioural biomechanics. So too are the measurements using head-mounted sensors that we have made in the field, and the 3D flight trajectories that we have recorded of attacks and escapes in the wild. We have also had to develop many new ways of modelling these data, from the data processing required to track the markers, through the use of computer simulations to model the birds' flight, to the combination of computer vision and deep learning to understand the optimization of the birds' guidance and control. In Years 3-5, we expect to use these techniques to reveal many new insights to a wide audience of biologists, engineers, and the general public, and to apply our best ideas to the drones that we are now testing daily in the Flight Lab.

Website & more info

More info: https://flight.zoo.ox.ac.uk/.