Over recent years, the use of Micro Aerial Vehicles (MAVs) in real-world applications has evolved from concept to creation. Systems have been designed and developed for exploration and mapping, fire monitoring, search and rescue, and various other applications. MAVs currently in use are semi-autonomous in nature, i.e., processing of sensor data and subsequent control of the vehicle must be performed manually and remotely. As such, there are several challenges associated with fully autonomous operation in dynamic environments, compounded by the fact that MAV airframes support only a reduced payload capacity (modern military MAVs are approximately 15 cm across and weigh only a few grams). This necessitates a guidance system that can be integrated onto the vehicle without interfering with its aerodynamics; fundamentally, such a system must be fast, reliable and accurate to ensure safe flight.
This thesis describes a generic platform for vision-based guidance that is partially inspired by the biological principles underlying insect vision. Flying insects are intelligent organisms capable of complex manoeuvres in dynamic environments. The insights gained by analysing the principles of insect flight can be synthesized into real-time reactive systems, because they offer novel and computationally elegant solutions for robotic guidance and navigation. This research works towards the vision of a fully autonomous MAV, “the size of an insect”, which can operate effectively in its immediate environment to perform goal-oriented tasks. It is a first step towards achieving that vision: the development and implementation of a real-time, reconfigurable optical flow calculator, which can be used in conjunction with other systems to control an autonomous vehicle. The solution comprises a CMOS sensor directly interfaced to an on-board Field Programmable Gate Array (FPGA), which computes an optical flow field of the vehicle’s surroundings in real time.
The prototype has been tested in a range of indoor and outdoor environments to ensure that it is robust and reliable. Indoor testing was performed using synthetic textured visual environments, while outdoor testing was carried out on a custom-built platform designed to exercise the system in a dynamic environment. Further testing used on-board flight footage from a UAV. The system operates on images of 640×480 pixels, computing an optical flow field of 88 vectors at 110 frames per second (fps), and is capable of a maximum of 150 fps. In addition, the imaging system provides useful information about the vehicle’s surroundings, enabling the construction of terrain maps. The current system could later be developed into a full-fledged ASIC, which could be integrated into a MAV or other autonomous vehicles as a plug-and-play module for artificial vision.
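To make the idea of a sparse optical flow field concrete, the following is a minimal software sketch of how one flow vector per grid cell can be estimated from two successive frames, using a Lucas–Kanade-style least-squares fit of the brightness-constancy constraint. This is purely illustrative: the thesis implements the computation in FPGA hardware, and the specific algorithm, grid layout (an 8×11 grid would yield the 88 vectors mentioned above) and window size used here are assumptions, not the thesis design.

```python
import numpy as np

def sparse_flow(prev, curr, grid=(8, 11), win=16):
    """Estimate one (u, v) flow vector per grid cell from two
    greyscale frames, via a Lucas-Kanade-style least-squares fit.
    Grid and window sizes are illustrative assumptions."""
    H, W = prev.shape
    # Spatial gradients of the first frame and the temporal difference.
    Iy, Ix = np.gradient(prev.astype(float))
    It = curr.astype(float) - prev.astype(float)
    rows, cols = grid
    flow = np.zeros((rows, cols, 2))
    for r in range(rows):
        for c in range(cols):
            # Window centred on the grid cell.
            y0 = r * H // rows + H // (2 * rows) - win // 2
            x0 = c * W // cols + W // (2 * cols) - win // 2
            sl = (slice(max(y0, 0), y0 + win), slice(max(x0, 0), x0 + win))
            ix, iy, it = Ix[sl].ravel(), Iy[sl].ravel(), It[sl].ravel()
            # Solve the normal equations of Ix*u + Iy*v + It = 0.
            A = np.stack([ix, iy], axis=1)
            ATA = A.T @ A
            if np.linalg.det(ATA) > 1e-6:  # skip textureless cells
                flow[r, c] = np.linalg.solve(ATA, -A.T @ it)
    return flow
```

For a 640×480 frame pair with the default grid this returns an 8×11 array of flow vectors, one per cell. The FPGA version described in the thesis achieves this per-cell parallelism in hardware rather than with nested Python loops.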