Space Technology Research Grants

Reconfiguring Worlds with Simple Actuation via Physics-Based Nonprehensile Actions

Completed Technology Project

Project Description

Increased capability in autonomous navigation has revolutionized space exploration. Autonomous rovers have allowed us to explore further and for longer durations than human-driven missions ever could. To date, missions for such rovers have been largely passive: these robots explore their environments, capturing images and other data measurements, but provide little direct interaction. As we begin to execute missions that aim not just to explore but to develop infrastructure on remote surfaces, the need for higher levels of autonomy greatly increases. The next revolution in space exploration lies in the ability to manipulate the environment in ways that increase the capabilities of rovers and allow for the execution of more complicated tasks. The central tenet of my proposal is that incorporating simple physics-based actions, like pushing, toppling, and sliding, allows key physical manipulation tasks to be carried out without attaching an extra expensive, heavy payload to the rover.

Autonomous manipulation has been heavily studied over the past few decades, with much of the work focused on pick-and-place actions. However, these types of actions have reached a limit. Humans interact with the environment using a diverse, rich set of actions beyond pick-and-place; such interactions allow humans to manipulate objects too large or heavy to be carried. Actions such as pushing, pivoting, and pulling rely on a set of simple expectations we as humans have about the physics of the object being moved and its environment. Enabling robots to use similar reasoning when planning manipulation tasks will greatly increase their capabilities, allowing smaller, less complex robots to be used in previously infeasible tasks.

Search-based algorithms have been used extensively for generating motion plans for manipulators. Typically, these planners assume the object being manipulated is rigidly attached to the manipulator and search for a sequence of actions that move the manipulator, and thus the object, to a goal location. I propose to break this assumption and instead introduce manipulation planners that search across action sets composed of physics-based actions. During planning, simple physics models can be used to predict the object's response to these actions (a minimal sketch of such a planner follows this description). Modifying the planners in this way will expand the range of tasks that can be planned and executed.

Introducing manipulation tasks also increases the need to reduce uncertainty in sensing. We can use the knowledge gained from interactions during the task to help with this. Previous work has shown that pushing an object can reduce the uncertainty in the object's position relative to the manipulator. Similarly, one can imagine situations where touching a static object in the environment could eliminate uncertainty in the position of the robot. By incorporating these types of actions into our planners, we can develop plans that are robust to large amounts of uncertainty.

The algorithms we develop will expand the types of tasks that can be asked of robots while reducing the need to hard-code task-specific action sequences. This is particularly important for robots operating in unknown environments, where exact mission sequences are not known a priori. Introducing physics-based nonprehensile actions will allow smaller, simpler robots to perform complicated tasks. Applying these algorithms to NASA rovers will greatly expand autonomous missions on remote surfaces, opening the door for vast scientific discoveries.
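To make the planning idea concrete, the sketch below shows a best-first search over robot and object configurations on a grid, where the action set mixes ordinary robot moves with a physics-based push whose outcome is predicted by a deliberately simple contact model. This is an illustrative sketch under assumed simplifications, not the project's implementation: the names (`predict_push`, `plan`, `MOVES`) and the grid world are hypothetical, and a real planner would replace the placeholder model with a quasi-static or simulated physics model.

```python
import heapq

# Illustrative only: a best-first search over (robot, object) configurations on a
# grid, with an action set that mixes plain robot moves and physics-based pushes.
# The "physics model" below is a trivial placeholder; a real planner would use a
# quasi-static or simulated contact model to predict the object's response.

MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]  # unit translations on the grid


def predict_push(robot, obj, direction):
    """Placeholder physics model: a push moves the object one cell only when the
    robot is directly behind it along the push direction; otherwise nothing moves."""
    dx, dy = direction
    if (robot[0] + dx, robot[1] + dy) == obj:       # robot is in contact behind the object
        return (obj[0] + dx, obj[1] + dy)           # object slides one cell
    return obj                                      # no contact, no motion


def plan(robot, obj, goal, max_expansions=100000):
    """Search for a sequence of moves and pushes that places the object at `goal`."""
    def h(o):                                       # Manhattan-distance heuristic on the object
        return abs(o[0] - goal[0]) + abs(o[1] - goal[1])

    start = (robot, obj)
    frontier = [(h(obj), 0, start, [])]             # (priority, cost, state, action sequence)
    seen = {start}
    while frontier and max_expansions > 0:
        max_expansions -= 1
        _, cost, (r, o), actions = heapq.heappop(frontier)
        if o == goal:
            return actions
        for d in MOVES:
            # Plain robot move: no interaction with the object.
            nr = (r[0] + d[0], r[1] + d[1])
            if nr != o and (nr, o) not in seen:
                seen.add((nr, o))
                heapq.heappush(frontier,
                               (cost + 1 + h(o), cost + 1, (nr, o), actions + [("move", d)]))
            # Physics-based push: the object's response comes from the model.
            no = predict_push(r, o, d)
            if no != o:
                nr = o                              # robot follows into the object's old cell
                if (nr, no) not in seen:
                    seen.add((nr, no))
                    heapq.heappush(frontier,
                                   (cost + 1 + h(no), cost + 1, (nr, no), actions + [("push", d)]))
    return None                                     # no plan found within the expansion budget


# Example: the robot starts at (0, 0) and must push a block from (2, 2) to (5, 2).
if __name__ == "__main__":
    print(plan((0, 0), (2, 2), (5, 2)))
```

The same structure accommodates other nonprehensile primitives such as toppling and sliding by adding further predictive models to the action set; the practical cost is that each expansion must evaluate a physics prediction rather than a rigid-attachment assumption.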
