Space Technology Research Grants

Collaborative Augmented Reality with Hands‐Free Gesture Control for Remote Astronaut Training and Mentoring

Completed Technology Project

Project Description

Augmented Reality (AR), in which 3D and 2D images are overlaid on a user's natural field of view like a heads-up display, is being developed for use in manned space applications. In the past two years, NASA, through AR-eProc, and the European Space Agency, through MARSOP, have demonstrated the potential of AR to improve hands-on training and just-in-time training (JITT). This research aims to demonstrate the feasibility, and test the performance, of novel hands-free, remote or proximity, instantaneous collaboration or mentoring between individuals in the AR environment, augmenting audio communication for training, task completion, and other space applications.

Real-time AR collaboration can provide novel, flexible, spontaneous communication capabilities not found in current systems. Collaboration would be facilitated by drawing on, or annotating, preexisting AR graphics using tablets or gesture detection; these annotations can be viewed by other individuals in real time. Hands-free operation would be accomplished with radar-based gesture-recognition transceivers, which are robust to their environment and embeddable in rigid materials. They precisely measure motion in 3D, allowing gestures to be adapted to limited ranges of motion. Embedded gesture sensors and AR headsets enable potential expansion of use to EVA or other space suit applications.

The proposed research aims to demonstrate the feasibility of the combined use of such systems and to evaluate their performance. This would fulfill research in Technical Area 04: Robotics and Autonomous Systems, Section 4.4, items 3 and 5 (4.4.3 and 4.4.5). The potential for AR control in EVA applications opens future development in AR communication with autonomous systems (4.4.3). By its remote, collaborative nature, this research can provide tools that facilitate resource and task allocation, trading and sharing of control, and dialogue management.

Use Cases: Remote or proximity AR collaboration or mentoring on Earth, in the deep sea, or in Low Earth Orbit; proximity-only AR collaboration or mentoring in deep space or in situations with low communication bandwidth to remote sources.

Approach: First, the collaborative AR drawing functionality will be implemented using touchscreens with the AR-eProc system developed at NASA Johnson Space Center. Next, gestures appropriate for hands-free drawing will be developed and integrated into AR-eProc. Sample instruction modules of increasing sophistication (without AR, with static AR images, with touchscreen AR collaboration, and with radar-based AR gesture collaboration) will be tested with volunteers. The system's impact on user comfort, efficiency, and perceived utility in job environments will be assessed. Iterative, agile methods will be used to adjust software development as assessments reveal areas needing improvement. Evaluation of the AR platforms will use task completion time, the NASA TLX subjective workload assessment tool, and subjective surveys of test subjects to link AR interactivity with changes in productivity. Feasibility of radar gesture recognition will be evaluated as recognition accuracy: the percentage of recognized gestures out of the number of properly executed gestures during testing.

Needed Resources: The AR-eProc system developed at NASA Johnson, especially the headset hardware, is needed. Additional Microsoft HoloLens headsets will be available for sale in 2016 at a price of $3,000 each.
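These headsets would host the shared annotation layer at the heart of the proposed collaboration: one user draws on an AR graphic and every other participant sees the stroke immediately. As a rough illustration of that publish/subscribe shape only, the following Python sketch broadcasts strokes to nearby clients; the stroke message format, the UDP multicast transport, and the render_callback hook are assumptions made for this sketch and are not part of AR-eProc, MARSOP, or the HoloLens APIs.

```python
# Hypothetical sketch of real-time AR annotation sharing between clients.
# Message format and UDP multicast transport are illustrative assumptions.
import json
import socket
import time

MCAST_GROUP = ("239.1.2.3", 50007)  # hypothetical multicast group for nearby AR clients

def send_stroke(points, author, color="yellow"):
    """Broadcast one annotation stroke (a list of (x, y, z) points in the
    shared AR frame) so every listening headset can render it."""
    msg = json.dumps({
        "type": "stroke",
        "author": author,
        "color": color,
        "timestamp": time.time(),
        "points": points,
    }).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)
        sock.sendto(msg, MCAST_GROUP)

def receive_strokes(render_callback):
    """Listen for strokes drawn by other users and hand each one to the
    AR application's rendering layer via render_callback."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(("", MCAST_GROUP[1]))
        mreq = socket.inet_aton(MCAST_GROUP[0]) + socket.inet_aton("0.0.0.0")
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        while True:
            data, _ = sock.recvfrom(65535)
            stroke = json.loads(data.decode("utf-8"))
            if stroke.get("type") == "stroke":
                render_callback(stroke)
```

A flight-like system would differ in transport, authentication, and coordinate-frame synchronization; the sketch shows only the basic broadcast-and-render flow of the collaboration.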
For radar-based gesture detection, a development kit from Google ATAP's Project Soli would provide the sensor and application programming interface, allowing rapid creation of gesture recognition patterns. Otherwise, microwave Doppler sensors are available commercially off the shelf; Doshisha University in Japan has demonstrated gesture recognition using the commercial NJR4211J microwave Doppler sensor.
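To make the gesture-recognition side and its evaluation metric concrete, the sketch below classifies a gesture window from a Doppler sensor by its dominant Doppler frequency and computes recognition accuracy as the percentage of recognized gestures out of properly executed gestures, as described in the Approach. The sample rate, gesture templates, and frequency bands are illustrative assumptions and do not represent the Project Soli API or the NJR4211J interface.

```python
# Hypothetical sketch of Doppler-based gesture classification and of the
# recognition-accuracy metric. All constants here are illustrative.
import numpy as np

FS = 1000          # assumed sample rate (Hz) of the Doppler sensor output
GESTURES = {       # toy templates: dominant Doppler band (Hz) per gesture
    "swipe": (40, 120),
    "tap": (5, 40),
}

def classify(doppler_samples):
    """Classify one gesture window by its dominant Doppler frequency."""
    windowed = doppler_samples * np.hanning(len(doppler_samples))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(doppler_samples), d=1.0 / FS)
    peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
    for name, (lo, hi) in GESTURES.items():
        if lo <= peak < hi:
            return name
    return None                                  # gesture not recognized

def recognition_accuracy(results):
    """Accuracy = recognized gestures / properly executed gestures (percent)."""
    recognized = sum(1 for intended, got in results if got == intended)
    return 100.0 * recognized / len(results)

# Example: simulate a 'swipe' (80 Hz Doppler tone) and a 'tap' (20 Hz tone).
t = np.arange(0, 0.5, 1.0 / FS)
trials = [("swipe", classify(np.sin(2 * np.pi * 80 * t))),
          ("tap", classify(np.sin(2 * np.pi * 20 * t)))]
print(recognition_accuracy(trials))  # 100.0 for these clean, simulated signals
```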

