{"project":{"acronym":"","projectId":89301,"title":"Human-Robot Collaboration on Complex Tasks","primaryTaxonomyNodes":[{"taxonomyNodeId":10629,"taxonomyRootId":8816,"parentNodeId":10628,"level":3,"code":"TX04.4.1","title":"Multi-Modal and Proximate Interaction","definition":"Multi-modal interaction allows for humans to interact with robots using multiple modes of communication, e.g. voice, gesture recognition. Proximate interaction allows for humans to interact with a robot side-by-side. Technologies to assist in these can enable humans to safely and efficiently control a larger number of robotic and autonomous assets, reducing overall demands on astronauts’ time for future exploration missions.","exampleTechnologies":"Virtual environment (VE), multi-modal dialogue, robot-to-suit interfaces, intent recognition and reaction, feedback displays for proximate interaction","hasChildren":false,"hasInteriorContent":true}],"startTrl":2,"currentTrl":3,"endTrl":3,"benefits":"
The end goal of this project is to achieve seamless human-robot cooperation on complex tasks, approaching the ease and accuracy of human-human collaboration. The ultimate impact is a world where robots actively interpret a person's instructions, asking questions when they are confused, and asking for help when they encounter a problem.
","description":"The aim of this proposal is to test the hypothesis that a system can increase speed and accuracy at inferring human intentions and increase the number of robots a single astronaut can supervise by inferring a person’s mental state from their language utterances and actively asking questions when it is confused. The approach is driven by a formal model for human-robot collaboration that enables a robot to make plans without complete information and reason about what a person wants. This inference will enable a person to effectively communicate complex requirements and task constraints to the robot at very abstract levels (e.g., \"Inspect the ISS\") and at very specific levels (e.g., \"Move left six inches\"), as well as to recover from failure by asking targeted questions (e.g., \"Do you mean the Phillips screwdriver or the flat head?\"). The end goal of this project is to achieve seamless human-robot cooperation on complex tasks, approaching the ease and accuracy of human-human collaboration. Humans communicating with other humans use language to express very abstract goals as well as very low-level concrete commands. This use of language enables high levels of autonomy but also supports flexible and fluid corrections and replanning. Robots that can use fluid language at multiple levels of abstraction can flexibly respond to a person's requests. The ultimate impact is a world where robots actively interpret a person’s instructions, asking questions when they are confused, and asking for help when they encounter a problem.
","startYear":2016,"startMonth":10,"endYear":2019,"endMonth":10,"statusDescription":"Completed","principalInvestigators":[{"contactId":442010,"canUserEdit":false,"firstName":"Stefanie","lastName":"Tellex","fullName":"Stefanie Tellex","fullNameInverted":"Tellex, Stefanie","primaryEmail":"stefie10@cs.brown.edu","publicEmail":false,"nacontact":false}],"programDirectors":[{"contactId":84634,"canUserEdit":false,"firstName":"Claudia","lastName":"Meyer","fullName":"Claudia M Meyer","fullNameInverted":"Meyer, Claudia M","middleInitial":"M","primaryEmail":"claudia.m.meyer@nasa.gov","publicEmail":true,"nacontact":false}],"programExecutives":[{"contactId":84634,"canUserEdit":false,"firstName":"Claudia","lastName":"Meyer","fullName":"Claudia M Meyer","fullNameInverted":"Meyer, Claudia M","middleInitial":"M","primaryEmail":"claudia.m.meyer@nasa.gov","publicEmail":true,"nacontact":false}],"programManagers":[{"contactId":183514,"canUserEdit":false,"firstName":"Hung","lastName":"Nguyen","fullName":"Hung D Nguyen","fullNameInverted":"Nguyen, Hung D","middleInitial":"D","primaryEmail":"hung.d.nguyen@nasa.gov","publicEmail":true,"nacontact":false}],"projectManagers":[{"contactId":276167,"canUserEdit":false,"firstName":"Kimberly","lastName":"Hambuchen","fullName":"Kimberly A Hambuchen","fullNameInverted":"Hambuchen, Kimberly A","middleInitial":"A","primaryEmail":"kimberly.a.hambuchen@nasa.gov","publicEmail":true,"nacontact":false}],"website":"https://www.nasa.gov/strg#.VQb6T0jJzyE","libraryItems":[],"transitions":[{"transitionId":75957,"projectId":89301,"transitionDate":"2019-10-01","path":"Closed Out","details":"The aim of this proposal is to test the hypothesis that a system can increase speed and accuracy at inferring human intentions and increase the number of robots a single astronaut can supervise by inferring a person’s mental state from their language utterances and actively asking questions when it is confused. 
The approach is driven by a formal model of HRI that learns a hierarchical representation for planning under uncertainty, called the Human-Robot Collaborative POMDP. This inference will enable a person to effectively communicate complex requirements and task constraints to the robot at very abstract levels (e.g., \"Inspect the ISS\") and at very specific levels (e.g., \"Move left six inches\"), as well as to recover from failure by asking targeted questions (e.g., \"Do you mean the Phillips screwdriver or the flat head?\").
","infoText":"Closed out","infoTextExtra":"","dateText":"October 2019"}],"responsibleMd":{"acronym":"STMD","canUserEdit":false,"city":"","external":false,"linkCount":0,"organizationId":4875,"organizationName":"Space Technology Mission Directorate","organizationType":"NASA_Mission_Directorate","naorganization":false,"organizationTypePretty":"NASA Mission Directorate"},"program":{"acronym":"STRG","active":true,"description":"\tThe Space Technology Research Grants Program will accelerate the development of \"push\" technologies to support the future space science and exploration needs of NASA, other government agencies and the commercial space sector. Innovative efforts with high risk and high payoff will be encouraged. The program is composed of two competitively awarded components.
","programId":69,"responsibleMd":{"acronym":"STMD","canUserEdit":false,"city":"","external":false,"linkCount":0,"organizationId":4875,"organizationName":"Space Technology Mission Directorate","organizationType":"NASA_Mission_Directorate","naorganization":false,"organizationTypePretty":"NASA Mission Directorate"},"responsibleMdId":4875,"stockImageFileId":36658,"title":"Space Technology Research Grants"},"leadOrganization":{"canUserEdit":false,"city":"Providence","country":{"abbreviation":"US","countryId":236,"name":"United States"},"countryId":236,"external":true,"linkCount":0,"organizationId":3955,"organizationName":"Brown University","organizationType":"Academia","stateTerritory":{"abbreviation":"RI","country":{"abbreviation":"US","countryId":236,"name":"United States"},"countryId":236,"name":"Rhode Island","stateTerritoryId":8},"stateTerritoryId":8,"murepUnitId":217156,"naorganization":false,"organizationTypePretty":"Academia"},"supportingOrganizations":[{"acronym":"JSC","canUserEdit":false,"city":"Houston","country":{"abbreviation":"US","countryId":236,"name":"United States"},"countryId":236,"external":false,"linkCount":0,"organizationId":4853,"organizationName":"Johnson Space Center","organizationType":"NASA_Center","stateTerritory":{"abbreviation":"TX","country":{"abbreviation":"US","countryId":236,"name":"United States"},"countryId":236,"name":"Texas","stateTerritoryId":29},"stateTerritoryId":29,"naorganization":false,"organizationTypePretty":"NASA Center"}],"statesWithWork":[{"abbreviation":"RI","country":{"abbreviation":"US","countryId":236,"name":"United States"},"countryId":236,"name":"Rhode Island","stateTerritoryId":8}],"lastUpdated":"2024-2-6","releaseStatusString":"Released","viewCount":448,"endDateString":"Oct 2019","startDateString":"Oct 2016"}}