Transformative Aeronautics Concepts Program

Preliminary Computer Vision Datasets for Autonomous Aviation

Completed Technology Project

Project Description

Airborne Instrumentation for Real-world Video of Urban Environments (AIRVUE)

A NASA project developing computer vision algorithms for autonomous flight is producing real-world datasets using cameras mounted on aircraft. NASA innovators are sharing these open datasets with the broader computer vision and autonomy research community to advance the development of autonomy in aviation. These datasets are the first to be released in an ongoing effort to collect and share large, diverse datasets relevant to autonomous aviation. Researchers anticipate this work will lead to innovations in computer vision, data fusion, and autonomous perception that will enable NASA’s vision for Advanced Air Mobility (AAM).

Computer vision development relies heavily on datasets collected in an application-relevant context, with reliable ground-truth data for evaluation. In the domain of autonomous driving, significant advances have resulted from the many datasets collected and published by various research teams around the world.

However, few vision datasets are publicly available in the aviation context. Researchers at NASA’s Armstrong Flight Research Center in Edwards, California, working with the Transformational Tools and Technologies Project, have developed a sensor payload for aircraft, with plans to collect large datasets across a diverse range of flight situations. The platform aircraft include a multirotor small unmanned aerial system and a crewed helicopter, which serve as surrogates for future electric vertical takeoff and landing (eVTOL) aircraft.

In preparation for a large data collection campaign, the project team conducted several preliminary flights and is sharing the resulting datasets with the broader community to help foster development of autonomy and perception in the aviation domain. The datasets provide video imagery with associated inertial navigation system/global positioning system (INS-GPS) data and attitude estimates. Armstrong researchers welcome community critique that can inform and improve future flight campaigns.
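To illustrate how a dataset of this kind might be consumed, the sketch below pairs each video frame with the nearest INS-GPS/attitude record by timestamp. The file layout and column names (time_s, lat_deg, and so on) are assumptions made for illustration; they are not taken from the AIRVUE release.

```python
# Hypothetical sketch: matching video frames to INS-GPS/attitude records by
# timestamp. The CSV layout and column names are assumed, not AIRVUE's.
import bisect
import csv

def load_ins_gps(path):
    """Load INS-GPS records as (t, lat, lon, alt, roll, pitch, yaw), sorted by time."""
    records = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            records.append((
                float(row["time_s"]),  # assumed column names
                float(row["lat_deg"]), float(row["lon_deg"]), float(row["alt_m"]),
                float(row["roll_deg"]), float(row["pitch_deg"]), float(row["yaw_deg"]),
            ))
    records.sort(key=lambda r: r[0])
    return records

def nearest_record(records, t_frame):
    """Return the INS-GPS record closest in time to a video frame timestamp."""
    times = [r[0] for r in records]
    i = bisect.bisect_left(times, t_frame)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(records)]
    return min((records[j] for j in candidates), key=lambda r: abs(r[0] - t_frame))
```

In practice, the nearest-neighbor match would likely be replaced by interpolation between bracketing records when the camera frame rate and the navigation-log rate differ.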

This dataset contains video and trajectory data for the approach and landing of a vertical takeoff and landing (VTOL) aircraft, intended for the evaluation of visual simultaneous localization and mapping (VSLAM) computer vision algorithms. The landing sites are ground-level helipads and an elevated platform on the rooftop of a building in an urban area. The video segments cover a range of lighting conditions, including dawn, midday, dusk, and night.
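One common way to score a VSLAM estimate against recorded trajectory data is absolute trajectory error (ATE). The minimal sketch below assumes both trajectories are already time-synchronized, expressed in the same local frame, and (for monocular VSLAM) scale-resolved; it is an illustrative metric, not the project’s evaluation procedure.

```python
# Minimal ATE sketch: RMSE of position error between an estimated camera path
# and a reference path, after removing the mean offset between them.
import numpy as np

def ate_rmse(estimated, reference):
    """estimated, reference: (N, 3) arrays of XYZ positions in meters."""
    est = np.asarray(estimated, dtype=float)
    ref = np.asarray(reference, dtype=float)
    # Align centroids only; a full evaluation would use a rigid-body (or
    # similarity) alignment such as the Umeyama method.
    est_aligned = est - est.mean(axis=0) + ref.mean(axis=0)
    errors = np.linalg.norm(est_aligned - ref, axis=1)
    return float(np.sqrt(np.mean(errors ** 2)))
```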
