Robotic systems in space carry a lower risk tolerance than robotic systems on Earth. Introducing more complex robotics in space demands faster learning curves, and the most practical path to them is open-source software running on easily adaptable hardware, which would let astronauts complete multiple design cycles while in space, such as aboard the ISS. Swift Engineering proposes a lightweight surround visual and sensory feedback system for robotic pilots that is easily transferable, modular, and scalable to any robotic system. Using 360-degree cameras, LIDAR, and a Myo armband, the robotic pilot will be able to adapt quickly to any environment from anywhere, including mission control. Critically, the entire system is built on open-source platforms, so nothing becomes overly proprietary and astronauts can iterate through design cycles in space quickly and efficiently.
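The modular, scalable architecture described above can be sketched as a plugin-style sensor interface: any feedback source (camera, LIDAR, armband) implements one common contract, and a hub aggregates whatever is registered. This is a minimal illustrative sketch only; all class and method names (`SensorSource`, `FeedbackHub`, etc.) are hypothetical and not part of any stated Swift Engineering design.

```python
from abc import ABC, abstractmethod


class SensorSource(ABC):
    """Common contract so any sensor can be swapped in or out (hypothetical)."""

    @abstractmethod
    def read(self) -> dict:
        """Return the latest sample as a plain dict."""


class Camera360(SensorSource):
    # Placeholder for a 360-degree camera feed.
    def read(self) -> dict:
        return {"type": "camera360", "frame": "<equirectangular image>"}


class Lidar(SensorSource):
    # Placeholder for a LIDAR range scan.
    def read(self) -> dict:
        return {"type": "lidar", "ranges_m": [1.2, 0.9, 3.4]}


class MyoArmband(SensorSource):
    # Placeholder for Myo armband gesture input.
    def read(self) -> dict:
        return {"type": "myo", "gesture": "fist"}


class FeedbackHub:
    """Aggregates whichever sensors are registered -- modular and scalable."""

    def __init__(self) -> None:
        self.sources: list[SensorSource] = []

    def register(self, source: SensorSource) -> None:
        self.sources.append(source)

    def poll(self) -> list[dict]:
        # One pass over all registered sources; new sensor types
        # require no changes here, only a new SensorSource subclass.
        return [s.read() for s in self.sources]


hub = FeedbackHub()
for src in (Camera360(), Lidar(), MyoArmband()):
    hub.register(src)
samples = hub.poll()
print([s["type"] for s in samples])  # ['camera360', 'lidar', 'myo']
```

Because each sensor is isolated behind the same interface, a pilot's station on the ISS or at mission control could register only the sources available in that environment, which is the transferability the proposal emphasizes.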