Instrument Incubator

Concurrent Artificially-intelligent Spectrometry and Adaptive Lidar System (CASALS)

Active Technology Project

Project Introduction

We propose to develop a brassboard version of a polar-orbiting SmallSat observing system that integrates adaptive lidar, hyperspectral imaging, and on-board artificial intelligence (AI) technologies. The brassboard will demonstrate the instrument performance required for spaceflight in key sub-systems. The technology, the Concurrent Artificially-intelligent Spectrometry and Adaptive Lidar System (CASALS), will provide high-priority measurements supporting scientific studies and societal applications related to the carbon cycle and ecosystems, the cryosphere's response to climate change, natural hazards, and atmospheric clouds and aerosols.

The lidar, with altimeter and atmosphere-profiling channels, will adaptively distribute the laser beam to specified locations across a swath using a photonic-integrated-circuit seed laser, wavelength-tuning circuitry, a high-power fiber amplifier, and a wavelength-to-angle-mapping dispersive grating. This capability will enable measurement continuity with the ICESat-2 and GEDI lidar missions and, for the first time from space, achieve 3-D lidar imaging of a swath. The receiver telescope will employ free-form optics to substantially reduce its volume compared to traditional designs. A novel broadband filtering approach will reject solar background noise using a second grating. Signal photons will be detected by a linear-mode, photon-sensitive detector array whose time-domain multiplexing electronics differentiate the locations from which the photons return. The receiver telescope will be shared by the lidar and MiniSpec, a visible-NIR-SWIR hyperspectral imaging sensor that will provide information on target properties. MiniSpec also uses free-form optics for volume reduction.
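The wavelength-to-angle beam steering described above follows the standard diffraction-grating relation, d(sin θ_i + sin θ_m) = mλ: tuning the seed-laser wavelength shifts the diffracted angle, and hence where the beam lands across the swath. The sketch below illustrates this relationship only; the groove density, incidence angle, wavelengths, and sign convention are illustrative assumptions, not CASALS design values.

```python
import math

def diffraction_angle_deg(wavelength_nm, grooves_per_mm, incidence_deg, order=1):
    """Solve the grating equation d*(sin(theta_i) + sin(theta_m)) = m*lambda
    for the diffracted angle theta_m, in degrees (one common sign convention)."""
    d_nm = 1e6 / grooves_per_mm  # groove spacing in nm
    s = order * wavelength_nm / d_nm - math.sin(math.radians(incidence_deg))
    if abs(s) > 1.0:
        raise ValueError("no propagating diffraction order for these parameters")
    return math.degrees(math.asin(s))

# Sweeping the wavelength sweeps the diffracted beam angle
# (illustrative values: 600 grooves/mm, normal incidence, first order).
for wl in (1060.0, 1062.0, 1064.0, 1066.0):
    print(f"{wl:.0f} nm -> {diffraction_angle_deg(wl, 600.0, 0.0):.3f} deg")
```

A few nanometers of tuning moves the beam by a fraction of a degree here; a real design would choose the groove density and tuning range to span the desired swath.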
AI-assisted machine learning will be used for real-time hyperspectral image classification to support autonomous decision-making, including optimized lidar beam targeting and data-volume reduction through spectral band subsetting, on-orbit product generation, and product compression. Together, the concurrent information on vertical structure from the lidar and on target properties from the spectral data will enable new scientific and application capabilities that neither instrument could achieve separately. The results will address five Earth Science Decadal Survey observable recommendations: ecosystem structure; ice elevation; snow depth and water equivalent; topography and 3-D vegetation; and the atmospheric boundary layer.

Our work will advance several space-system technologies, including combined active/passive sensing, photonic integrated circuits, emerging sensor technologies, free-form optics, and compact electronics. We will demonstrate smart sensing methods coupled to emerging machine-learning information-processing technologies. On-platform computational capacity will be used to coordinate among instruments and models of physical phenomena, and to react to changing environmental conditions. The period of performance is January 1, 2020 to December 31, 2022. The technology readiness level (TRL) will be 2 at the start and 4 upon completion.
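One simple form of the data-volume reduction mentioned above is spectral band subsetting: downlinking only the most informative bands of a hyperspectral cube. The sketch below ranks bands by per-band pixel variance, a deliberately simple stand-in for the AI-assisted selection the project describes; the cube layout and the variance criterion are assumptions for illustration, not the project's method.

```python
def select_bands(cube, k):
    """Rank spectral bands by per-band pixel variance and keep the top k.

    cube: list of bands, each a flat list of pixel values for that band.
    Returns the sorted indices of the k highest-variance bands.
    Variance ranking is an illustrative criterion only.
    """
    scored = []
    for i, band in enumerate(cube):
        mean = sum(band) / len(band)
        var = sum((x - mean) ** 2 for x in band) / len(band)
        scored.append((var, i))
    scored.sort(reverse=True)          # highest variance first
    return sorted(i for _, i in scored[:k])

# Toy 4-band "cube": nearly flat bands carry little information and are dropped.
cube = [
    [5, 5, 5, 5],      # band 0: constant -> variance 0
    [0, 10, 20, 30],   # band 1: high variance
    [2, 3, 2, 3],      # band 2: low variance
    [1, 9, 1, 9],      # band 3: moderate variance
]
print(select_bands(cube, 2))  # -> [1, 3]
```

Keeping 2 of 4 bands halves the data volume in this toy case; an on-orbit system would apply a learned, scene-aware criterion rather than raw variance.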

