Peter Kalmus et al.

Coral reefs are rapidly declining due to local environmental degradation and global climate change. In particular, corals are vulnerable to ocean heating: anomalously hot sea surface temperatures (SSTs) create conditions for severe bleaching or direct thermal death. We use SST observations and CMIP6 model SSTs to project thermal conditions at reef locations at a resolution of 1 km, a 16-fold improvement over prior studies, under four climate emissions scenarios. We use a novel statistical downscaling method that is significantly more skillful than the standard method, especially at near-coastal pixels where many reefs are found. For each location we present projections of thermal departure (TD, the date after which a location with steadily increasing heat exceeds a given thermal metric) for severe bleaching recurring every 5 years (TD5Y) and every 10 years (TD10Y), accounting for a range of post-bleaching reef recovery/degradation. As of 2021, we find that over 91% and 79% of 1 km reef locations have exceeded TD10Y and TD5Y, respectively, suggesting that widespread long-term coral degradation is no longer avoidable. We project 99% of reefs to exceed TD5Y by 2034, 2036, and 2040 under SSP5-8.5, SSP3-7.0, and SSP2-4.5, respectively. We project that 2%-5% of reef locations remain below TD5Y at 1.5 degrees Celsius of mean global heating, but 0% remain at 2.0 degrees Celsius. These results demonstrate the importance of further improving ecological projection capacity for climate-vulnerable marine and terrestrial species and ecosystems, including identifying refugia and guiding conservation efforts. Ultimately, saving coral reefs will require rapidly reducing and eliminating greenhouse gas emissions.
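The thermal departure idea can be illustrated with a minimal sketch: given a time series of annual maximum degree heating weeks (DHW) at one reef pixel, find the first year after which every subsequent window of N years contains at least one severe-bleaching exceedance. The 8 °C-weeks threshold below is NOAA Coral Reef Watch's standard severe-bleaching (Alert Level 2) criterion; the function name and windowing logic are illustrative assumptions, not the paper's exact metric, which also accounts for post-bleaching recovery/degradation.

```python
import numpy as np

def thermal_departure(annual_max_dhw, years, recur_years, dhw_threshold=8.0):
    """Return the first year after which severe bleaching (annual max DHW
    >= dhw_threshold, in degC-weeks) recurs at least once in every
    subsequent recur_years-year window; None if the series never departs.

    Simplified illustration of the TD5Y/TD10Y concept only.
    """
    hot = np.asarray(annual_max_dhw) >= dhw_threshold
    n = len(hot)
    for i in range(n):
        # every sliding window of length recur_years starting at or after i
        # must contain at least one exceedance
        recurs = all(hot[j:j + recur_years].any()
                     for j in range(i, n - recur_years + 1))
        if recurs and hot[i]:
            return years[i]
    return None
```

For a location with steadily increasing heat, TD10Y is reached no later than TD5Y, since a 10-year window is easier to satisfy than a 5-year one.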

Jon Jenkins et al.

The Surface Biology and Geology (SBG) mission is one of the core missions of NASA’s Earth System Observatory (ESO). SBG will acquire high-resolution solar-reflected spectroscopy and thermal infrared observations at a data rate of ~10 TB/day and generate products at ~75 TB/day. As the per-day volume is greater than NASA’s total extant airborne hyperspectral data collection, collecting, processing/reprocessing, disseminating, and exploiting the SBG data presents new challenges. To address these challenges, we are developing a prototype science pipeline and a full-volume global hyperspectral synthetic data set to help prepare for SBG’s flight. Our science pipeline is based on the science processing operations technology developed for the Kepler and TESS planet-hunting missions. The pipeline infrastructure, Ziggy, provides a scalable architecture for robust, repeatable, and replicable science and application products that can be run on a range of systems, from a laptop to the cloud or an on-site supercomputer. Our effort began by ingesting data and applying workflows from the EO-1/Hyperion 17-year mission archive, which provides globally sampled visible through shortwave infrared spectra representative of SBG data types and volumes. We have fully implemented the first stage of processing, from the raw data (Level 0) to top-of-the-atmosphere radiance (Level 1R). We plan to begin reprocessing the entire 55 TB Hyperion data set by the end of 2021. Work to implement an atmospheric correction module to convert the L1R data to surface reflectance (Level 2) is also underway. Additionally, an effort to develop a hybrid High Performance Computing (HPC)/cloud processing framework has been started to help optimize the cost, processing throughput, and overall system resiliency for SBG’s science data system (SDS).
Separately, we have developed a method for generating full-volume synthetic data sets for SBG based on MODIS data and have made the first version of this data set available to the community on the data portal of NASA’s Advanced Supercomputing Division at NASA Ames Research Center. The synthetic data will make it possible to test parts of the pipeline infrastructure and other software to be applied for product generation.
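The staged processing described above (Level 0 raw data → Level 1R radiance → Level 2 surface reflectance) can be sketched as a chain of named pipeline stages. This is a toy illustration of the pattern only: the stage names follow the abstract, but the `Stage` dataclass, the linear-gain "calibration," and the flat-factor "atmospheric correction" are invented placeholders, not the actual Ziggy or SBG algorithms.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Stage:
    """One named, repeatable processing step operating on a shared data dict."""
    name: str
    run: Callable[[dict], dict]

def level0_to_l1r(data: dict) -> dict:
    # toy radiometric calibration: instrument counts -> radiance via gain/offset
    data["l1r"] = [c * data["gain"] + data["offset"] for c in data["l0"]]
    return data

def l1r_to_l2(data: dict) -> dict:
    # toy atmospheric correction: radiance -> reflectance by a flat scaling
    data["l2"] = [r / data["solar_irradiance"] for r in data["l1r"]]
    return data

PIPELINE = [Stage("L0->L1R", level0_to_l1r), Stage("L1R->L2", l1r_to_l2)]

def process(data: dict) -> dict:
    for stage in PIPELINE:
        data = stage.run(data)  # each stage's output feeds the next
    return data
```

Keeping each level transition as an isolated stage is what makes full-archive reprocessing (such as the planned 55 TB Hyperion run) repeatable: a stage can be rerun or swapped without touching the others.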

Ann Raiho et al.

The retrieval algorithms used to estimate Earth’s geophysical properties from optical remote sensing satellite data have specific requirements for spatial resolution, temporal revisit, spectral range and resolution, and instrument signal-to-noise ratio (SNR) performance to meet science objectives. Studies that estimate surface properties from hyperspectral data use a range of algorithms sensitive to various sources of spectroscopic uncertainty, which are in turn influenced by mission architecture choices. Retrieval algorithms vary across scientific fields and may be more or less sensitive to mission architecture choices that affect spectral, spatial, or temporal resolutions and spectrometer SNR. We used representative remote sensing algorithms across terrestrial and aquatic study domains to identify the aspects of mission design that most affect retrieval accuracy in each scientific area. We simulated the propagation of uncertainties through the retrieval process, including the effects of different instrument configuration choices. We found that retrieval accuracy and information content degrade consistently at >10 nm spectral resolution, >30 m spatial resolution, and >8 day revisit. In these studies, the noise reduction associated with lower spatial resolution improved accuracy compared with high spatial resolution measurements. The interplay between spatial resolution, temporal revisit, and SNR can be quantitatively assessed for imaging spectroscopy missions and used to identify key components of algorithm performance and mission observing criteria.
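The kind of instrument-configuration simulation described above can be sketched minimally: degrade a fine-grid reflectance spectrum to a chosen spectral resolution by convolving with a Gaussian spectral response, then add Gaussian noise at a chosen SNR. This is a toy forward model under stated assumptions (Gaussian response, signal-proportional noise, sampling at one FWHM); the study's actual uncertainty propagation is far more complete.

```python
import numpy as np

def degrade_spectrum(wl_fine, refl_fine, fwhm_nm, snr, rng=None):
    """Simulate a coarser, noisier instrument measurement of a fine spectrum.

    wl_fine, refl_fine : fine-grid wavelengths (nm) and reflectances
    fwhm_nm            : Gaussian spectral response width and output spacing
    snr                : signal-to-noise ratio (noise std = signal / snr)
    """
    rng = np.random.default_rng() if rng is None else rng
    sigma = fwhm_nm / 2.355  # FWHM -> Gaussian sigma
    wl_out = np.arange(wl_fine[0], wl_fine[-1], fwhm_nm)
    refl_out = np.empty(len(wl_out))
    for i, w in enumerate(wl_out):
        # normalized Gaussian-weighted average around each output channel
        weights = np.exp(-0.5 * ((wl_fine - w) / sigma) ** 2)
        refl_out[i] = np.sum(weights * refl_fine) / np.sum(weights)
    noise = rng.normal(0.0, refl_out / snr)  # noise scales with signal at fixed SNR
    return wl_out, refl_out + noise
```

Sweeping `fwhm_nm` and `snr` over candidate mission architectures and rerunning a retrieval on the degraded spectra is one way to quantify the accuracy trade-offs the abstract reports (e.g., degradation beyond 10 nm spectral resolution).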