Unmanned aerial vehicles (UAVs) are increasingly used for a wide variety of civil and commercial applications, such as infrastructure inspection and maintenance, search and rescue, mapping and cartography, and agricultural and environmental monitoring. Unmanned aircraft are suited to these roles because they can be smaller and lighter than manned aircraft, and hence cheaper to operate, and because they can perform dull or repetitive tasks with greater precision than human operators, and dangerous tasks with greater safety. With the expanding set of roles for UAVs comes an increasing need for them to fly with a degree of low-level autonomy, freeing their human controllers to concentrate on high-level decisions.
Modern UAVs control their position and orientation in space using technologies such as the Global Positioning System (GPS) and attitude and heading reference systems (AHRSs). These systems alone, however, cannot detect or avoid other objects or vehicles, rendering the aircraft incapable of operating autonomously in near-Earth environments or around other moving vehicles. In such situations the aircraft must monitor its surroundings continuously. Active proximity sensors, such as laser range-finders or radar, can be bulky, stealth-compromising, power-hungry, and low-bandwidth, limiting their utility for small-scale UAVs. There is therefore considerable benefit in designing guidance systems that use passive sensing, such as vision.
This thesis builds on recent research into biological vision-based flight control strategies to demonstrate that such bioinspired methods can offer dramatically improved sensing and control efficiencies over more complex computer vision-based approaches. Furthermore, this thesis establishes that wide-angle vision systems enable a broad array of sensing and guidance strategies, which can be implemented in parallel, in turn enabling complex flight behaviours that would traditionally require sensing and processing architectures incompatible with small-scale UAVs.
Two wide-angle vision systems are developed during the course of this research, along with a number of novel vision-based sensing and guidance algorithms that enable detection of oncoming obstacles, estimation of attitude and altitude, long-term tracking of features, and interception of independently moving objects. Using these systems, complex capabilities such as low-altitude terrain following, aerobatics, landing in an uncontrolled environment, and tracking and interception of an independently moving object are all demonstrated for the first time using only the computing resources available on board a small-scale UAV, with vision as the sole source of sensing and guidance information.
The findings of this thesis contribute to a greater understanding of the minimum sensing and guidance architectures required for complex UAV behaviours. The design methodologies proposed herein represent an important step towards full autonomy for small-scale airborne platforms, furthering the exploitation of UAVs for civil and commercial applications and bringing autonomous UAVs a step closer to the remarkable capabilities of their biological counterparts.