Exploration of planetary environments with current robotic technologies relies on human control and power-hungry active sensors to perform even the most elementary low-level functions. Ideally, a robot would autonomously explore novel environments, memorize the locations of obstacles and objects, learn about objects, build and update an environment map, and return to a safe location. All of these tasks are typical activities that animals perform efficiently on a daily basis. The primary objective of the proposed research is to develop a biologically inspired neuromorphic application that implements these functionalities in an autonomous robot or unmanned aerial system (UAS). The Phase I effort implemented a neuromorphic system capable of exploring an unknown environment, avoiding obstacles, and returning to base for refueling or recharging without the use of a Global Navigation Satellite System (GNSS). This system was successfully tested in a Mars-like virtual environment and on a simple robot. Leveraging the Phase I results, the Phase II effort will develop visual processing based on passive sensors to find, identify, localize, and interact with objects, and will use this information to enhance navigation capabilities. Neurala's neuromorphic application will also allow for human guidance through an intuitive user interface. Low-power hardware will be evaluated to enable real-time performance on robots and unmanned platforms.