An autonomous relative navigation system based on a combination of low-cost infrared and vision sensors will be created. Such a system has the potential to be small, inexpensive, and capable of autonomous operation over ranges from a few meters up to several kilometers, even against uncooperative targets such as dead satellites and space debris. This proposal uses recently developed COTS sensor hardware together with robust algorithms, which I will develop and implement in software, to perform measurement modeling and simulation, relative navigation, object identification, and state estimation. I will combine new methods of image processing, object identification, tracking, and state estimation into an overall system that is robust to varied optical conditions (lighting, focus) and ranges. System performance will be evaluated using high-fidelity simulated images. These software tools will then become part of the NASA AR&D Warehouse and can be applied to a wide range of proximity operations in future spacecraft missions.
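To illustrate the kind of state estimation this system requires, the sketch below shows a minimal extended Kalman filter that fuses a range measurement (as an infrared sensor might provide) with a bearing measurement (as a camera might provide) to track the planar relative position and velocity of a target. The dynamics model, noise levels, and sensor models here are illustrative assumptions for a two-dimensional case, not the measurement models to be developed in this work.

```python
import numpy as np

def ekf_step(x, P, z, dt, q=1e-3, r_range=0.5, r_bearing=0.01):
    """One EKF predict/update cycle for relative navigation (2-D sketch).

    x  : state [px, py, vx, vy] -- relative position (m) and velocity (m/s)
    P  : 4x4 state covariance
    z  : measurement [range (m), bearing (rad)]
    dt : time step (s)
    """
    # Predict: constant-velocity relative motion (assumed dynamics model)
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    Q = q * np.eye(4)                     # assumed process noise
    x = F @ x
    P = F @ P @ F.T + Q

    # Nonlinear range/bearing measurement model and its Jacobian
    px, py = x[0], x[1]
    rng = np.hypot(px, py)
    h = np.array([rng, np.arctan2(py, px)])
    H = np.array([[ px / rng,     py / rng,    0, 0],
                  [-py / rng**2,  px / rng**2, 0, 0]])
    R = np.diag([r_range**2, r_bearing**2])  # assumed sensor noise

    # Update: standard EKF correction, with the bearing residual wrapped
    y = z - h
    y[1] = np.arctan2(np.sin(y[1]), np.cos(y[1]))
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P
```

In practice, the measurement vector would be produced by the image-processing and object-identification front end, and the filter would run at the sensor frame rate; a flight implementation would also need a higher-fidelity relative-motion model and fault handling for dropped or ambiguous detections.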