Aurora proposes to develop a vision-based subsystem for incorporation onto Mars vehicles operating in the air (VTOL) and on the ground. NOAMAD will be an embedded hardware device with associated firmware for payload-limited UAVs, performing autonomous navigation, obstacle avoidance, bio-inspired guidance, and communication of information between agents within the autonomous team. NOAMAD will transition University of Maryland methods for insect-inspired, lightweight navigation based on vision and optical sensors into a subsystem that expands the exploratory capability of the vehicles on which it is installed. The subsystem will provide (1) localization (without a global navigation system or compass) using optic-flow-based odometry combined with landmark detection, (2) obstacle detection and avoidance using optic flow, and (3) autonomous guidance combining position information with bio-inspired behaviors. Taken together, these functions will allow air and ground vehicles to work together to produce progressively refined maps of an exploration region.
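To illustrate the kind of computation underlying function (1), the sketch below estimates a single image-plane translation between two grayscale frames with a least-squares Lucas-Kanade fit. This is an assumption-laden toy, not the NOAMAD implementation: the actual subsystem would integrate dense per-patch flow over time and fuse it with landmark detections, and the function names and synthetic test pattern here are hypothetical.

```python
import numpy as np

def estimate_global_flow(prev, curr):
    """Estimate one (dx, dy) translation between two grayscale frames
    via a least-squares Lucas-Kanade fit over the whole image.
    Illustrative only: real optic-flow odometry computes local flow
    fields and integrates them into an odometric position estimate."""
    # Spatial gradients of the previous frame (central differences)
    # and the temporal brightness difference between frames.
    Ix = np.gradient(prev, axis=1)
    Iy = np.gradient(prev, axis=0)
    It = curr - prev
    # Brightness constancy: Ix*dx + Iy*dy + It = 0, solved in the
    # least-squares sense over all pixels.
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (dx, dy), *_ = np.linalg.lstsq(A, b, rcond=None)
    return dx, dy

# Hypothetical test: a smooth intensity pattern shifted one pixel in x.
x = np.linspace(0.0, 4.0 * np.pi, 64)
frame0 = np.sin(x)[None, :] * np.cos(x)[:, None]
frame1 = np.roll(frame0, 1, axis=1)  # content moves +1 px in x
dx, dy = estimate_global_flow(frame0, frame1)
```

Summing such frame-to-frame displacements yields a dead-reckoned position whose drift the landmark-detection component would then correct.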