Summary: Visually guided navigation offers an ideal arena for studying how behaviour emerges from the interaction of brain, body and environment. It is a behaviour seen across the animal kingdom, from desert ants to humans, raising the possibility of common underlying computational principles. It also represents an important applied problem: how to get robots to navigate reliably in situations where other kinds of information are scarce or absent. This thesis examines the problem of visual navigation in insects at different levels of abstraction using computational modelling, with a focus on image-based homing methods, showing the power of this approach in describing and explaining insect behaviour. It begins with an examination of an exploratory behaviour performed by naive foraging ants, known as 'learning walks', showing how the shape of the learning walk and the visual form of the environment together determine homing success. I then examine the information carried by the visual receptive fields associated with a small number of neurons (of two classes) in Drosophila, showing that this information corresponds to behavioural performance, without requiring any additional black boxes. Finally, I show that simple insect-inspired algorithms also perform well in applied contexts, such as on a flying agent and as a real-world visual compass. The contribution of this thesis is to show the value of computational modelling both in gaining an understanding of complex behaviours, particularly where many variables make more conventional analysis intractable, and in designing real-world applications.