Biological systems use visual information in many different contexts. One example is the ability to focus attention; another is the ability to use visual information as a navigational aid. As navigation is an important issue in the field of robotics, it is worth trying to build robots that are able to mimic this natural processing of visual information. This project examines and implements mechanisms to develop such behaviours on a mobile robot. A corridor-following algorithm based on visual information was implemented, and it worked successfully in a number of different corridors.
Another important aspect of navigation is the ability to perceive distances, since distances are usually required as parameters when a plan of the environment is created, either by an operator or by the robot itself. In this project a learning mechanism is used to learn the relationship between optic flow and ego-motion: the robot learns to estimate its own motion from changes in the image. The robot is able to estimate distances with less than 10% relative error, provided the travelled distance is sufficiently long; for short distances the relative error increases.
I would like to thank my supervisor, Dr Ulrich Nehmzow, and his PhD student, Stephen Marsland, for their guidance and useful advice.
I would also like to thank all the members of the Robotics group for their kindness and their help, especially Andrew Pickering for his assistance regarding the hardware.
Finally, I would like to thank Dr David Rydeheard for his invaluable advice on the successful completion of an MSc project. I dedicate this thesis to my family.