As remote sensing evolves, users across all applications are discovering the value of adding LiDAR instruments to a hyperspectral payload. First, instruments of all types are getting smaller and lighter. Second, UAVs are more capable than ever in terms of carrying capacity and airborne stability. Finally, users across agriculture, minerals and mining, and environmental research can now collect a full suite of useful data at one time. With powerful GPS/IMU devices to tie the data streams together, a completely integrated hyperspectral-LiDAR airborne package is now the 'gold standard' for many remote sensing missions.
Hyperspectral and LiDAR are very much complementary. A hyperspectral imaging sensor collects a full spectrum of data for every pixel within the field of view. For the Nano-Hyperspec sensor, that's 270 spectral bands for each of the 640 spatial pixels in each row of the image. The spectral data cover the visible and near infrared ranges and extend beyond human vision. Analysis of these spectra can be used in agricultural applications to detect anomalies such as crop diseases, water deficits, and overall plant stress and vigor. The combination of spatial imagery and spectral information is called a hyperspectral data cube, which can be several gigabytes in size.
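To make the cube dimensions concrete, here is a minimal sketch of how the size of such a data cube can be estimated from the sensor geometry above. The storage assumption (16-bit samples) and the line count are illustrative, not Nano-Hyperspec specifications.

```python
# Illustrative sketch: estimating hyperspectral data cube size for a
# pushbroom sensor with Nano-Hyperspec-like dimensions.
# Assumes each sample is stored as a 16-bit word (an assumption).

spatial_pixels = 640      # pixels per scan line
spectral_bands = 270      # spectral bands per pixel
bytes_per_sample = 2      # 16-bit storage (assumed)

def cube_size_bytes(num_lines: int) -> int:
    """Size in bytes of a cube built from num_lines scan lines."""
    return num_lines * spatial_pixels * spectral_bands * bytes_per_sample

# A hypothetical 10,000-line flight segment:
size_gb = cube_size_bytes(10_000) / 1e9
print(f"{size_gb:.2f} GB")  # prints "3.46 GB"
```

Even a modest flight segment quickly reaches the multi-gigabyte sizes mentioned above, which is why onboard solid-state storage matters.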
The Nano-Hyperspec is a “pushbroom” line scan imager. That is, it takes one line of image data with each captured frame. As the imager moves over the scene being imaged, the hyperspectral image is built up line by line, or row by row. The raw data cube will, in general, have distortions due to variations in the air speed, altitude, roll, pitch and yaw of the aircraft. Through a process called “orthorectification”, the raw data cube is projected by Headwall’s SpectralView software onto a planar view of the terrain below, with uniformly spaced pixels. Accurately projecting a series of lines of pixels from a moving aircraft onto the ground requires information about the position of each pixel in the imager relative to the ground at each point in time.
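The line-by-line build-up described above can be sketched as follows. The frame source is simulated; a real system would read frames from the sensor, and the dimensions match the figures quoted earlier.

```python
import numpy as np

# Minimal sketch of pushbroom acquisition: each captured frame is one
# scan line of shape (spatial_pixels, spectral_bands); stacking frames
# over time builds the raw data cube. capture_frame() is a stand-in
# for a real frame grab from the sensor.

spatial_pixels, spectral_bands = 640, 270

def capture_frame() -> np.ndarray:
    # Simulated 12-bit sensor readout stored in 16-bit words.
    return np.random.randint(
        0, 4096, (spatial_pixels, spectral_bands), dtype=np.uint16
    )

# Collect 100 scan lines as the imager moves over the scene.
lines = [capture_frame() for _ in range(100)]
cube = np.stack(lines, axis=0)
print(cube.shape)  # (100, 640, 270): lines x spatial pixels x bands
```

Each index along the first axis corresponds to a different instant in the flight, which is why aircraft motion between frames distorts the raw cube until it is orthorectified.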
The position of the imager relative to the center of the earth can be measured with a high-quality GPS/IMU instrument. The GPS (Global Positioning System) collects data on the latitude, longitude, and altitude of the craft while aloft. The IMU (Inertial Measurement Unit) accounts for any roll, pitch, and yaw of the UAV. To accurately calculate the size of each pixel in the image as projected on the ground, the elevation of the ground relative to the center of the earth is needed as well. This information about elevation as a function of latitude and longitude can be stored in a Digital Elevation Model (DEM). Thus, proper orthorectification of line-scan imagery requires accurate GPS/IMU information as well as an accurate DEM.
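As a simplified illustration of why the DEM matters, the sketch below approximates the ground footprint of a single nadir-looking pixel from the GPS altitude and a DEM lookup. The per-pixel field of view value and the `dem()` function are hypothetical stand-ins, and real orthorectification must also account for roll, pitch, yaw, and off-nadir viewing angles.

```python
import math

# Hedged sketch: approximate ground footprint of one nadir pixel.
# IFOV_RAD (per-pixel instantaneous field of view) is an assumed,
# illustrative value, not a Nano-Hyperspec specification.
IFOV_RAD = 0.68e-3

def dem(lat: float, lon: float) -> float:
    # Stand-in for a Digital Elevation Model lookup
    # (terrain elevation in meters above sea level).
    return 150.0

def pixel_ground_size(gps_altitude_m: float, lat: float, lon: float) -> float:
    """Approximate on-ground size of one pixel, in meters."""
    height_above_ground = gps_altitude_m - dem(lat, lon)
    return height_above_ground * math.tan(IFOV_RAD)

# UAV at 400 m above sea level over terrain at 150 m elevation:
print(round(pixel_ground_size(400.0, 42.5, -71.4), 2))  # prints 0.17
```

Note that without the DEM, using altitude above sea level alone (400 m instead of 250 m above ground) would overestimate the pixel footprint by 60%, which is exactly the kind of error orthorectification must avoid.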
The top image above is a 3D point cloud; beneath it is a 2D DEM made from the LiDAR point cloud.
In fact, LiDAR data can be used to create an accurate DEM. The simultaneous data streams coming from the Nano-Hyperspec and the LiDAR are stored on the 480GB solid-state drive aboard the Nano-Hyperspec. The advantage of collecting and storing hyperspectral and LiDAR data simultaneously is flight efficiency: more ground can be covered in a given amount of time.
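The idea of deriving a DEM from LiDAR can be sketched as a simple gridding step: points are binned into raster cells and elevations averaged per cell. This is a minimal illustration only; production tools such as Headwall's LiDAR Tools also handle ground/non-ground classification, outlier filtering, and interpolation of empty cells, which are omitted here.

```python
import numpy as np

# Hedged sketch: rasterize a LiDAR point cloud (columns x, y, z in
# meters) into a simple DEM by averaging point elevations per grid cell.

def point_cloud_to_dem(points: np.ndarray, cell_size: float) -> np.ndarray:
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    col = ((x - x.min()) / cell_size).astype(int)
    row = ((y - y.min()) / cell_size).astype(int)
    dem = np.full((row.max() + 1, col.max() + 1), np.nan)
    sums = np.zeros(dem.shape)
    counts = np.zeros(dem.shape)
    # Accumulate elevation sums and point counts per cell.
    np.add.at(sums, (row, col), z)
    np.add.at(counts, (row, col), 1)
    occupied = counts > 0
    dem[occupied] = sums[occupied] / counts[occupied]
    return dem  # NaN marks cells with no LiDAR returns

# Three points: two fall in one 1 m cell, one in the adjacent cell.
pts = np.array([[0.0, 0.0, 10.0], [0.4, 0.2, 12.0], [1.2, 0.1, 20.0]])
dem_grid = point_cloud_to_dem(pts, cell_size=1.0)
print(dem_grid)  # cell (0,0) averages 10 and 12 -> 11.0; cell (0,1) -> 20.0
```

A DEM produced this way (at production quality) is exactly the elevation input the orthorectification step described earlier requires.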
Turning raw LiDAR data into a 3-dimensional point cloud is done with Headwall’s LiDAR Tools software, which can also be used to generate the DEM to be used in the orthorectification process. Further software to fuse the hyperspectral and LiDAR data is in development now!