Maintenance operations and scientific research on the International Space Station (ISS) require active monitoring. Currently, the majority of data monitoring and recording is performed by the ISS crew. These tasks, albeit relatively passive, often consume large blocks of a crew member's time. In the future, it would be desirable to offload much of this observational work onto experts and technicians on the ground, enabling ISS crew members to focus on setup, control, and other tasks requiring greater dexterity. In addition, as recent events have shown, the ISS may be left uncrewed for periods of time, during which flight controllers will still want views of the station. Such a remote monitoring system must provide a wide variety of camera perspectives covering the majority of the ISS's interior, and it would be impractical to achieve adequate coverage with a network of fixed, mounted cameras. The MIT Space Systems Laboratory developed SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellites) as a platform for conducting experiments with free-flying satellites in space. We propose to develop stereo-based visual navigation and human-interaction algorithms that will extend the capabilities of SPHERES, and to demonstrate those algorithms on a ground-based simulator. The result will be safer, more efficient operation of space vehicles and reduced demands on crew and ground-control resources.