The Autonomous Rendezvous and Docking (AR&D) technology is intended to be integrated onto the Orion spacecraft for NASA's planned Asteroid Retrieval Mission (ARM). This technology meets a key EMC requirement for all future exploration architectures, namely the ability to navigate outside of low Earth orbit without the aid of ground support. It will allow vehicles to rendezvous with and capture uncontrolled target objects, including asteroids and debris objects such as spent upper stages. In addition, it will allow manned vehicles, such as Orion, to rendezvous and dock with, or capture, the unmanned vehicle attached to the asteroid in a Distant Retrograde Orbit (DRO), and to support other missions to Near Rectilinear Orbits (NROs). This project aligns well with NASA's human spaceflight needs: in particular with Autonomous Vehicle Systems Management, especially Crew Autonomy beyond LEO, and more loosely with Mission Control Automation beyond LEO. Because vehicles operating beyond LEO must continue to function during a loss of communications with Earth, autonomous deep-space navigation is imperative, both for performing the mission and for returning the crew safely to Earth. Every future exploration architecture being considered by NASA has, at its core, the need to rendezvous and dock with other vehicles or bodies, and future manned vehicles must be able to do so with both cooperative and uncooperative vehicles or objects. To this end, the sensors being considered are all optical-based. Passive sensors, such as infrared (IR) cameras and visual cameras, are at the heart of any exploration architecture, not least because they are small and lightweight; they have the added advantage of providing situational awareness.
There is a need, therefore, for onboard systems that can use the images provided by these sensors to rendezvous with and dock to, or capture, these objects. This project will advance the state of the art of onboard image-based navigation algorithms applied to AR&D, with the added benefit of being extensible to navigation around cratered bodies such as asteroids and the Moon. The technology to be harnessed is called 'optical flow', also known as 'visual odometry'. It is being matured in automotive and SLAM (Simultaneous Localization and Mapping) applications but has yet to be applied to spacecraft navigation. In light of the tremendous potential of this technique, we believe NASA needs to design an optical navigation architecture around it, one flexible enough to be applicable to navigating around planetary bodies such as asteroids. As noted above, every future exploration architecture being considered by NASA requires autonomous rendezvous and docking (AR&D) capability with both cooperative and uncooperative vehicles or objects, and the sensors being considered are all optical-based; the technology developed in this project will directly utilize the measurements derived from these sensors. We anticipate that the techniques developed here for optical navigation, particularly visual odometry, have great potential in other fields such as robotics, as well as military applications for navigating in and around buildings when GPS is not available. We intend to discuss the results of this research with our counterparts at the Air Force Research Laboratory (AFRL); JSC has already held preliminary discussions with AFRL on the potential of these techniques.
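To illustrate the underlying idea, the sketch below shows one classical variant of optical flow, the Lucas-Kanade least-squares formulation, estimating a global translation between two frames. This is a minimal illustration only, not the project's flight algorithm: the function name and the synthetic sinusoidal frames are our own assumptions for demonstration.

```python
import numpy as np

def lucas_kanade_flow(img1, img2):
    """Estimate a single global translation (u, v) between two frames
    using the Lucas-Kanade brightness-constancy least-squares model:
        Ix*u + Iy*v = -It
    solved over all pixels at once."""
    # Spatial gradients of the first frame (central differences)
    Ix = np.gradient(img1, axis=1)
    Iy = np.gradient(img1, axis=0)
    # Temporal difference between frames
    It = img2 - img1
    # Stack the gradient columns and solve A [u, v]^T = -It in least squares
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Synthetic test scene: a smooth pattern shifted right by 1 pixel
x, y = np.meshgrid(np.arange(64), np.arange(64))
frame1 = np.sin(0.2 * x) + np.cos(0.15 * y)
frame2 = np.sin(0.2 * (x - 1)) + np.cos(0.15 * y)  # same pattern, +1 px in x
u, v = lucas_kanade_flow(frame1, frame2)
```

On this synthetic pair the recovered flow (u, v) is close to (1, 0), the known shift. Real implementations compute the flow per feature window and over image pyramids to handle large motions; the global single-window form above is kept only for brevity.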