NASA is interested in a reliable, robust, low Size, Weight, and Power (SWAP) input device that will allow EVA astronauts to navigate display menu systems. The device should provide mouse-like functionality while requiring minimal hand use. Cybernet proposes a solution that does not require any hand or glove control. Instead, we propose an input device that uses purposive eye blinks, eye motions, and a limited set of vocal commands for display menu navigation. Our reasoning is that the astronaut, especially on EVA, needs a minimally intrusive method of accessing display menus: the hands are usually occupied, so using them for mouse-like gestures is impractical. Taking a cue from Google Glass, and building on the eye tracking system and voice interaction system we previously developed separately for NASA, we are confident we can create a system that interprets purposive eye blinks and eye motions so the astronaut can navigate display menus without interfering with other work.

Specifically, during Phase I we will create a feasibility demonstration of menu navigation driven by three input methods: eye gaze, purposive eye blinks, and limited-vocabulary voice commands. The combination of these three input methods should be relatively easy to learn and use (i.e., require minimal practice) and should not interfere with normal EVA operations. What is needed, though, is a small camera/microphone located within the astronaut's helmet that continually has a view of one or both of the astronaut's eyes. During Phase I we will implement a feasibility proof of the above input methods and research appropriate hardware. During Phase II we will acquire suitable hardware for a full prototype system that will enable us to demonstrate low SWAP, as well as measure accuracy and utility.
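To illustrate how the three input modalities could combine for menu navigation, the following is a minimal sketch of an event-dispatch approach. It is purely illustrative and not part of the proposed implementation: the class, event names, and the blink-duration threshold used to separate purposive blinks from reflexive ones are all assumptions.

```python
# Illustrative sketch only: combines the three proposed input
# modalities (eye gaze, purposive blinks, limited voice commands)
# into simple menu-navigation events. All names, thresholds, and
# the event model are assumptions for demonstration.

class MenuNavigator:
    """Dispatches gaze, blink, and voice events to menu actions."""

    def __init__(self, items):
        self.items = items          # flat list of menu item labels
        self.index = 0              # currently highlighted item
        self.selected = None        # last confirmed selection

    def on_gaze(self, direction):
        # Gaze up/down moves the highlight through the menu,
        # clamped to the list bounds.
        if direction == "down":
            self.index = min(self.index + 1, len(self.items) - 1)
        elif direction == "up":
            self.index = max(self.index - 1, 0)

    def on_blink(self, duration_ms):
        # A long (purposive) blink confirms the highlighted item;
        # short reflexive blinks are ignored. The 400 ms threshold
        # is a placeholder, not a measured value.
        if duration_ms >= 400:
            self.selected = self.items[self.index]

    def on_voice(self, command):
        # Limited-vocabulary commands: "cancel" clears a selection,
        # "top" returns the highlight to the first item.
        if command == "cancel":
            self.selected = None
        elif command == "top":
            self.index = 0


nav = MenuNavigator(["Suit Status", "Comms", "Checklist"])
nav.on_gaze("down")    # highlight moves to "Comms"
nav.on_blink(120)      # reflexive blink: ignored
nav.on_blink(500)      # purposive blink: selects "Comms"
nav.on_voice("top")    # voice command returns highlight to first item
```

A real system would replace the direct method calls with events from the eye tracker and speech recognizer, but the dispatch structure would be similar.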