Future generations of NASA land/aerial robots will be required to operate in the harsh, unpredictable environments of extra-terrestrial bodies, including asteroids, planets, and moons. To compensate for the unpredictability of their operating environments and the vast communication distances from Earth, these robots must exhibit an extremely high degree of autonomous behavior through highly customized, on-board hardware. One specific problem is autonomous navigation: robots must be able to navigate their target environments on their own by processing visual data. Furthermore, the temperature and radiation hazards of outer space introduce a large number of run-time faults into electronic components, potentially compromising the autonomous behavior of these robots. The human brain, however, although composed of billions of noisy devices, accurately performs highly complex navigational tasks. We propose to develop customized Application Specific Integrated Circuit (ASIC) chips that draw inspiration from the brain, both in their fault-tolerant structures and in their approach to solving difficult navigational tasks. First, using pairs of stereo-vision ASIC chips that merge photoreceptive cells with computational elements, we plan to develop a neuromorphic system that allows a robot to learn appropriate actions in its environment from visual data. Second, we will investigate placing biologically-inspired structures composed of faulty devices on CMOS chips to create fault-tolerant, robust computational elements capable of recovering from run-time errors. Together, these approaches will allow robots exploring extra-terrestrial objects to use visual data to learn about, and make progressively better decisions in, their unknown environments, and will allow their hardware to recover from unforeseen errors. Both approaches increase the operational lifetime of future NASA land/aerial robots as they push outward into the Solar System and beyond.
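The fault-tolerance idea above rests on aggregating many unreliable elements so that no single faulty device corrupts the overall result. The sketch below is a minimal, hypothetical software illustration of that principle only; it is not the project's ASIC architecture, and the names `noisy_unit` and `redundant_compute`, along with all parameter values, are assumptions chosen for clarity.

```python
# Hypothetical sketch: recovering a reliable result from many faulty units
# via a robust (median) vote, loosely analogous to how a neural population
# stays accurate despite noisy or failed neurons.
import random
import statistics


def noisy_unit(x: float, noise_std: float = 0.05, fail_prob: float = 0.02) -> float:
    """One unreliable computational element: intended output is x**2,
    but the result carries additive noise and occasionally a transient
    (e.g. radiation-induced) fault corrupts it entirely."""
    if random.random() < fail_prob:
        return random.uniform(-100.0, 100.0)  # corrupted output
    return x * x + random.gauss(0.0, noise_std)


def redundant_compute(x: float, n_units: int = 31) -> float:
    """Run many faulty units in parallel and take the median of their
    outputs, masking both device noise and outright run-time failures."""
    outputs = [noisy_unit(x) for _ in range(n_units)]
    return statistics.median(outputs)


if __name__ == "__main__":
    x = 3.0
    print(f"single unit : {noisy_unit(x):.3f}")        # may be badly wrong
    print(f"median of 31: {redundant_compute(x):.3f}")  # close to x**2 = 9.0
```

The median is used here rather than a simple average because it is insensitive to the occasional wildly corrupted output, which is the property a fault-tolerant aggregate needs.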
| Organizations Performing Work | Role | Type | Location |
| --- | --- | --- | --- |
| Boston University | Lead Organization | Academia | Boston, Massachusetts |
| Jet Propulsion Laboratory (JPL) | Supporting Organization | FFRDC/UARC | Pasadena, California |