Headwall Photonics Blog

The Eyes Have It...But Not Always

Posted by Christopher Van Veen on Thu, Apr 13, 2017

Humans have a marvelous ability to see and identify objects within what is called the visible range of the electromagnetic spectrum, which starts at roughly 380 nanometers and extends to around 700 nanometers.

But there are things that researchers and scientists might wish to 'see' that fall below (ultraviolet) or above (infrared) this 'visible' portion of the spectrum. If you were a bumblebee, you could see into the UV range; if you were a rattlesnake, you could see into the infrared range. Obviously (and thankfully) we're neither, so to see into these other ranges we need help. And why would we care about anything our eyes cannot see? To take precision agriculture as a key example, there are vegetative indices (VIs) that depend on seeing into the infrared ranges where the spectral signatures of chlorophyll fluorescence are detectable. Chlorophyll fluorescence is predictive of crop stress and vigor, so being able to see and quantify its effects can tell crop scientists much more than their own eyes can.
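To make that concrete, here is a minimal sketch of one of the best-known vegetative indices, NDVI, which compares near-infrared reflectance against red. (The code is illustrative only; band wavelengths and thresholds vary by sensor and crop.)

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom > 0)  # skip zero pixels
    return out

# Toy 2x2 scene: healthy canopy reflects strongly in the NIR.
red = np.array([[0.05, 0.30], [0.04, 0.25]])
nir = np.array([[0.50, 0.35], [0.45, 0.30]])
print(ndvi(red, nir))  # high values (~0.8) where vegetation is vigorous
```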

[Image: DJI Matrice 600 UAV]

Since we aren't bumblebees or snakes, we need tools to see into these outlying regions of the spectrum. Hyperspectral and multispectral sensors do the work our eyes can't, and they do it very well. In the case of hyperspectral, they collect a full spectrum of image data for every pixel within the field of view. Dozens of vegetative indices exist, each using spectral data to answer questions such as: Are there diseases on my crops I cannot see? Is my soil nutrient-rich? Are there invasive species I need to worry about? In the end, scientists concern themselves with finding answers to these and other questions rather than poring over complicated hyperspectral data cubes. In layman's terms, you don't go to Home Depot to buy a drill; you go there to buy a hole.
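As a rough illustration of what 'a full spectrum for every pixel' means in practice, a hyperspectral cube is commonly handled as a three-dimensional array. The shapes and wavelengths below are made up for illustration, not tied to any particular sensor.

```python
import numpy as np

# A cube is often stored as (lines, samples, bands); values are illustrative.
lines, samples, bands = 1000, 640, 270
cube = np.zeros((lines, samples, bands), dtype=np.uint16)
wavelengths = np.linspace(400.0, 1000.0, bands)  # nm, a VNIR-style example

# Every pixel carries a full spectrum...
spectrum = cube[500, 320, :]  # all 270 bands at one ground location

# ...and every band is a full image.
band_idx = int(np.argmin(np.abs(wavelengths - 670.0)))  # band nearest 670 nm
red_image = cube[:, :, band_idx]
```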

Hyperspectral and multispectral differ in the amount of image data being collected. Hyperspectral is to multispectral what hundreds of bands are to a handful. With multispectral, you may also have gaps between the rather wide bands, and what you want to detect might not register with the sensor. Hyperspectral, by contrast, comprises hundreds of narrow, contiguous spectral bands, so if a certain spectral signature is there you'll see it. There are places for both: if you know which wavelengths the vegetative indices of interest depend on and you're sure a multispectral sensor can capture them, you're all set. But it is far more common for scientists not to know exactly where along the electromagnetic spectrum a key VI exists. Is it somewhere between 400 and 1000 nanometers (nm), the visible-near-infrared (VNIR) range? Or is it further up, between 900 and 2500 nm, the shortwave-infrared (SWIR) range? Indeed, missions may change over the course of an instrument's life, which is why scientists often opt for a combined VNIR-SWIR sensor capturing image data from 400 nm all the way up to 2500 nm.
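A toy sketch makes the gap problem visible. The multispectral band centers and widths below are invented for illustration; a feature wavelength can fall between wide bands, whereas a contiguous hyperspectral range covers everything within its span.

```python
# Hypothetical multispectral sensor: a handful of (center_nm, width_nm)
# bands with gaps between them. Values are illustrative only.
multispectral_bands = [(490, 65), (560, 35), (665, 30), (842, 115)]

def covered(target_nm: float, bands) -> bool:
    """True if the target wavelength falls inside any band."""
    return any(abs(target_nm - center) <= width / 2 for center, width in bands)

print(covered(705.0, multispectral_bands))  # False: 705 nm falls in a gap

# A hyperspectral sensor with contiguous bands spanning 400-1000 nm
# covers any target wavelength within that range.
def covered_hyper(target_nm: float, lo: float = 400.0, hi: float = 1000.0) -> bool:
    return lo <= target_nm <= hi

print(covered_hyper(705.0))  # True
```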

Hyperspectral and multispectral imaging sensors are often 'line-scan' instruments, meaning they capture image data one slice at a time. The composition of all these slices (or frames) is a hyperspectral data cube, which can be several gigabytes in size. Post-processing software is very good at unscrambling this complex cube of data into meaningful answers, but aircraft stability is just as important. Since UAVs are quickly becoming the 'go-to' platform for crop scientists and others, keeping the craft stable in the air is fundamental to producing data that can be orthorectified rather than becoming a casualty of a wobbling UAV. Fortunately, stabilized gimbals are outstanding nowadays, reacting instantly to keep the sensor in its desired position no matter what the craft does.
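The multi-gigabyte claim is easy to sanity-check with back-of-envelope arithmetic: cube size is simply lines x samples x bands x bytes per sample. The numbers below are illustrative.

```python
# Hypothetical VNIR line-scan flight strip; all figures are assumptions.
lines, samples, bands = 8000, 640, 270
bytes_per_sample = 2  # 12- or 16-bit data typically stored as uint16

size_gb = lines * samples * bands * bytes_per_sample / 1e9
print(f"{size_gb:.1f} GB")  # ~2.8 GB for a single flight line
```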

Obviously, a UAV-based remote sensing system is a function of optics, electronics, and aerodynamics. Integration is an often-overlooked task, because many users assume they can buy a UAV and a sensor and simply bolt the two together. Unfortunately, experience shows that such a piecemeal, a-la-carte endeavor is likely to fail. Battery life comes into play, balance rears its head, and understanding the relationship between frame rate and ground speed can flummox anyone. Fortunately, companies like Headwall Photonics exist to manage this integration process. They understand a thing or two because they've seen a thing or two. They can recommend the right kind of UAV, take size, weight, and power (SWaP) into consideration, integrate spectral sensors with other instruments such as LiDAR, and deliver turnkey, flight-ready packages that even bumblebees and snakes would have to admire.
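To see why frame rate and ground speed are linked, consider that a line-scan sensor images one narrow ground strip per frame; to avoid gaps or smearing, the craft should advance roughly one ground-sample distance (GSD) per frame. Here is a hedged sketch of that relationship, with every parameter value assumed for illustration.

```python
def required_frame_rate(ground_speed_mps: float, gsd_m: float) -> float:
    """Frames per second so the craft advances about one GSD per frame."""
    return ground_speed_mps / gsd_m

# Assumed flight parameters (illustrative, not from any specific sensor):
altitude_m = 60.0
ifov_rad = 0.0009                 # per-pixel instantaneous field of view
gsd_m = altitude_m * ifov_rad     # ~0.054 m ground sample distance

print(required_frame_rate(5.0, gsd_m))  # ~93 fps at 5 m/s ground speed
```

The takeaway: fly faster or lower and the required frame rate climbs, which in turn constrains exposure time and battery budget, one of the many couplings a good integrator balances for you.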

Tags: hyperspectral imaging, Airborne, Remote Sensing, Nano-Hyperspec