Download the IROS'02 paper: [A4], [US letter] (PDF, 212 KB)
The robotic aircraft's onboard eye comprises 20 photoreceptors whose signals drive 19 ground-based neuromorphic elementary motion detectors (EMDs) that sense moving contrasts. Visual, inertial, and tachometer signals from the aircraft are sampled by a data acquisition board in the flight-control computer, which runs the Real-Time Linux operating system. A weighted-average fusion of the visual inputs commands thrust, while a PID controller regulates pitch. Flight commands are output via the parallel port to a microcontroller that interfaces with a standard radio-control model transmitter (an illustrative sketch of this processing chain is given at the end of this page).

Vision-based terrain following and landing were demonstrated in simulation. Automatic obstacle-avoiding flights at speeds between 2 m/s and 3 m/s were demonstrated in the laboratory.

This UAV project lies at the intersection of Neurobiology, Robotics, and Aerospace. It provides principles and technology to assist urban operations of Micro Air Vehicles (MAVs).

Related projects for flight with insect vision:

- Biorobotic Vision Laboratory (Srinivasan Lab), ANU, Canberra
- Centeye, Washington DC
- Dickinson Lab, Caltech, Pasadena
- Autonomous Systems Lab, EPFL, Lausanne
- Artificial Intelligence Lab, University of Zürich
- Hans van Hateren's Lab, University of Groningen

Insect vision and motion detection at the Institute of Neuroinformatics:

- Tobi Delbrück
- Giacomo Indiveri
- Shih-Chii Liu
- Jörg Kramer
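For readers who want a concrete picture of the processing chain described above, here is a minimal Python sketch of 20 photoreceptor signals feeding 19 correlation-type EMDs, a weighted-average fusion of their outputs into a thrust command, and a PID pitch regulator. It is an illustration only: the class and function names, gains, time constants, flat weighting, and 50 Hz loop rate are assumptions, not the project's actual flight code.

```python
# Illustrative sketch (not the project's flight code) of the control chain:
# 20 photoreceptors -> 19 correlation-type EMDs -> weighted-average fusion
# -> thrust command, plus a PID regulator for pitch.
import numpy as np


class EMDArray:
    """19 Hassenstein-Reichardt-style EMDs over adjacent photoreceptor pairs."""

    def __init__(self, n_photoreceptors=20, tau=0.04, dt=0.02):
        self.alpha = dt / (tau + dt)           # first-order low-pass acts as the delay
        self.lp = np.zeros(n_photoreceptors)   # low-pass (delayed) copy of each signal

    def step(self, photoreceptors):
        p = np.asarray(photoreceptors, dtype=float)
        self.lp += self.alpha * (p - self.lp)
        # Opponent correlation of each adjacent pair: delayed(A)*B - A*delayed(B)
        return self.lp[:-1] * p[1:] - p[:-1] * self.lp[1:]


class PID:
    """Textbook PID controller, used here to regulate pitch."""

    def __init__(self, kp, ki, kd, dt=0.02):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, error):
        self.integral += error * self.dt
        deriv = (error - self.prev_err) / self.dt
        self.prev_err = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv


def control_step(photoreceptors, pitch, pitch_ref, emds, pid, weights):
    """One control cycle: fuse EMD outputs into a thrust command, PID the pitch."""
    motion = emds.step(photoreceptors)
    thrust_cmd = np.dot(weights, motion) / np.sum(weights)  # weighted-average fusion
    pitch_cmd = pid.step(pitch_ref - pitch)
    return thrust_cmd, pitch_cmd


if __name__ == "__main__":
    emds = EMDArray()
    pid = PID(kp=1.2, ki=0.1, kd=0.05)        # gains chosen arbitrarily for the demo
    weights = np.ones(19)                      # flat weighting, purely an assumption
    photoreceptors = np.random.rand(20)        # stand-in for one sample of the eye
    thrust_cmd, pitch_cmd = control_step(photoreceptors, pitch=0.05, pitch_ref=0.0,
                                         emds=emds, pid=pid, weights=weights)
    print(f"thrust command: {thrust_cmd:.3f}, pitch command: {pitch_cmd:.3f}")
```

The first-order low-pass filter standing in for the EMD delay is a common choice in correlation-detector models; the actual neuromorphic EMDs and the fusion weights used on the aircraft may differ.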