Atmospheric compensation of long-wave infrared (LWIR) hyperspectral imagery is investigated in this article using set representations learned by a neural network. This approach relies on synthetic at-sensor radiance data derived from collected radiosondes and a diverse database of measured emissivity spectra sampled at a range of surface temperatures. The network loss function relies on LWIR radiative transfer equations to update model parameters. Atmospheric predictions are made on a set of diverse pixels extracted from the scene, without knowledge of blackbody pixels or pixel temperatures. The network architecture utilizes permutation-invariant layers to predict a set representation, similar to the work performed in point cloud classification. When applied to collected hyperspectral image data, this method shows comparable performance to Fast Line-of-Sight Atmospheric Analysis of Hypercubes-Infrared (FLAASH-IR), using an automated pixel selection approach. Additionally, inference time is significantly reduced compared to FLAASH-IR, with predictions made on average in 0.24 s on a 128 pixel by 5000 pixel data cube using a mobile graphics card. This computational speed-up on a low-power platform results in an autonomous atmospheric compensation method effective for real-time, onboard use, while only requiring a diversity of materials in the scene.
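The permutation-invariant set encoding the abstract describes can be illustrated with a minimal sketch in the spirit of DeepSets-style point-cloud networks: a shared per-pixel transform followed by a symmetric pooling operation, so the output does not depend on pixel ordering. The layer sizes, weights, and band count below are illustrative assumptions, not the architecture from the paper.

```python
import numpy as np

# Minimal sketch of a permutation-invariant set encoder (DeepSets-style).
# All dimensions and weights here are hypothetical placeholders.

rng = np.random.default_rng(0)

N_BANDS = 16   # hypothetical number of LWIR spectral bands per pixel
HIDDEN = 32    # hypothetical per-pixel feature width
OUT = 8        # hypothetical size of the learned set representation

W_phi = rng.standard_normal((N_BANDS, HIDDEN)) * 0.1
W_rho = rng.standard_normal((HIDDEN, OUT)) * 0.1

def encode_set(pixels: np.ndarray) -> np.ndarray:
    """Map a (num_pixels, N_BANDS) set of spectra to one OUT-dim vector.

    Per-pixel features are pooled with a symmetric function (mean), so the
    result is invariant to the ordering of pixels in the input set.
    """
    h = np.maximum(pixels @ W_phi, 0.0)   # shared per-pixel transform (ReLU)
    pooled = h.mean(axis=0)               # permutation-invariant pooling
    return pooled @ W_rho                 # map pooled features to output

pixels = rng.standard_normal((100, N_BANDS))
z1 = encode_set(pixels)
z2 = encode_set(pixels[::-1])             # same pixels, reversed order
assert np.allclose(z1, z2)                # output is order-independent
```

Because the pooling step is symmetric, any shuffling of the extracted scene pixels yields the same set representation, which is what allows atmospheric predictions from an unordered collection of diverse pixels.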


© The Authors. This is an open access article published by IEEE and distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Sourced from the published version of record cited below.

* Author notes: Author marked [*] was an AFIT graduate student at the time of publication. K. Gross co-affiliated with AFIT as an adjunct faculty member.

Source Publication

IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing