The innovations of the Fusion of Inertial Navigation and Imagery Data are the application of the concept to the dynamic entry-interface through near-landing phases, the autonomy and (near) real-time operation of the system, and the focus on satisfying the stringent reliability and verification requirements of spaceflight. This innovation will allow spacecraft to navigate autonomously, precisely, and safely from entry-interface to near-landing.

The plan is to develop automated techniques, suitable for onboard software, that incorporate recognized objects from imagery data into the vehicle's navigated solution. We will use image processing techniques to compare the imagery with expected views, pattern recognition techniques to identify known objects in the comparison, mechanisms for locating known objects using the navigated state, and filtering techniques to update the navigated state with the errors between the observed and expected results. To qualify as flight software, the proposed solution will be reliable and verifiable, and will operate within the limitations of the onboard equipment.

No existing techniques solve all of these problems. Current techniques for incorporating imagery data into navigated solutions use sensors with significantly shorter ranges, rely on registration markers placed on the target, use ground-based computational equipment, or require human intervention to arrive at a solution.
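The observed-minus-expected feedback loop described above can be sketched in simplified form. The following is an illustrative assumption, not the flight design: it reduces the problem to a 2-D position state, a single known landmark, and a fixed-gain filter update, where a real system would use a full state vector, a camera projection model, and a properly tuned estimator such as an extended Kalman filter. All names and the geometry are hypothetical.

```python
# Hypothetical, simplified sketch of the proposed fusion loop:
#  1. predict where a known landmark should appear given the navigated state,
#  2. compare that prediction with the landmark location recognized in imagery,
#  3. feed the observed-minus-expected residual through a fixed-gain update.

def expected_observation(nav_state, landmark):
    """Predict the landmark's position relative to the navigated vehicle state."""
    return (landmark[0] - nav_state[0], landmark[1] - nav_state[1])

def filter_update(nav_state, observed, landmark, gain=0.5):
    """Correct the navigated state using the observed-minus-expected residual."""
    exp = expected_observation(nav_state, landmark)
    residual = (observed[0] - exp[0], observed[1] - exp[1])
    # A positive residual means the landmark appears farther from the vehicle
    # than predicted, so the navigated position is biased toward the landmark.
    return (nav_state[0] - gain * residual[0],
            nav_state[1] - gain * residual[1])

truth = (0.0, 0.0)          # true vehicle position (unknown to the filter)
landmark = (100.0, 50.0)    # surveyed landmark in world coordinates
nav = (10.0, 0.0)           # navigated state, drifted 10 m in x

# Imagery "observes" the landmark where it truly is relative to the vehicle;
# repeated updates pull the drifted state back toward the truth.
observed = (landmark[0] - truth[0], landmark[1] - truth[1])
for _ in range(10):
    nav = filter_update(nav, observed, landmark)
print(nav)
```

With a gain of 0.5, each pass halves the 10 m drift, so the navigated state converges geometrically toward the true position; an onboard implementation would instead weight the residual by measurement and state covariances.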