Headwall Photonics Blog

Hyperspec SWIR Gets UAV Wings!

Posted by Christopher Van Veen on Thu, Sep 14, 2017

Remote sensing with hyperspectral sensors combines two key elements: an imaging spectrometer and a fast data-processing system to acquire and analyze spectral and spatial data. For remote sensing missions such as crop-disease or invasive-species detection, the spectral range of most interest is the visible-near-infrared (VNIR), from 400-1000nm.

Geologic exploration requires a different spectral region, because the signatures of interest appear in the shortwave-infrared (SWIR) range of 900-2500nm. Each mineral has its own spectral signature reflecting its chemical composition, and because many minerals differ only subtly in how they reflect light, distinguishing them requires many contiguous, narrow spectral bands.
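To make that point concrete, mineral identification usually comes down to comparing each measured pixel spectrum against a spectral library. Below is a minimal sketch of one common approach, the spectral angle; the band count and reflectance values are illustrative only, and this is not Headwall's software.

```python
import numpy as np

def spectral_angle(measured, reference):
    """Angle (radians) between a measured pixel spectrum and a library
    spectrum; a smaller angle means a closer match."""
    measured = np.asarray(measured, dtype=float)
    reference = np.asarray(reference, dtype=float)
    cos_theta = np.dot(measured, reference) / (
        np.linalg.norm(measured) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

# Toy 4-band library with made-up reflectance values; real SWIR work uses
# hundreds of contiguous bands between 900 and 2500 nm, which is exactly
# what separates minerals that reflect light only slightly differently.
library = {
    "mineral A": [0.55, 0.48, 0.30, 0.42],
    "mineral B": [0.60, 0.58, 0.52, 0.35],
}
pixel = [0.54, 0.47, 0.31, 0.41]

best = min(library, key=lambda name: spectral_angle(pixel, library[name]))
print("closest library match:", best)
```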

Generating SWIR hyperspectral data from a UAV puts a premium on size, weight, and power. Headwall has worked strategically to 'lightweight' its Hyperspec sensors for field deployment. One of the main design choices is an 'all-reflective' optical approach, which allows for a very small instrument size and form factor. The Hyperspec sensors are not only smaller and lighter but also more robust in airborne situations, particularly for UAV deployment.

[Image: Micro-Hyperspec SWIR mounted on a DJI multi-rotor UAV]

Headwall's Micro-Hyperspec SWIR is being successfully deployed on multi-rotor UAVs with excellent stability and imaging results. What's more, the entire payload also includes a high-performance GPS/IMU and Headwall's HyperCore data-fusion hub, which synthesizes the data streams coming from complementary instruments. LiDAR, for example, can also be integrated into the payload and handled by HyperCore.

There are several aspects of Headwall’s spectral imaging sensor design that are advantageous for geological research. “When you’re using a UAV, flight time is crucial,” said Peter Clemens, Director of Engineering at Headwall. “Efficiently capturing wide swaths of valuable image data on each flight is therefore a huge advantage.” Headwall has built ortho-rectification into its Hyperspec spectral software, allowing users to generate highly accurate geospatial data. “We give our customers the technology to identify not only what they see within the scene but a precise location as to where it is on the ground,” continued Clemens. Headwall has developed solution bundles that package the necessary hardware and software so that customers can not only acquire spectral data quickly but also process it efficiently and view spectral maps.

Tags: hyperspectral imaging, Airborne, Remote Sensing, geology

The Eyes Have It...But Not Always

Posted by Christopher Van Veen on Thu, Apr 13, 2017

Humans have a marvelous ability to see and identify objects within what is called the visible range of the electromagnetic spectrum. That range starts at roughly 380 nanometers and extends to around 700 nanometers.

But there are things that researchers and scientists might wish to 'see' that fall below (ultraviolet) or above (infrared) this 'visible' portion of the spectrum. If you were a bumblebee, you could see into the UV range; if you were a rattlesnake, into the infrared. Obviously (and thankfully) we're neither, but to see into these other ranges we need help. And why would we care about anything our eyes cannot see? Well, to take precision agriculture as a key example, there are vegetative indices (VIs) that depend on seeing into the infrared ranges where the spectral signatures of chlorophyll fluorescence are detectable. Chlorophyll fluorescence is predictive of crop stress and vigor, so being able to see and quantify it can tell crop scientists much more than their own eyes can.
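As a concrete, if simplified, illustration of a vegetative index, here is the classic NDVI computed from red and near-infrared reflectance. This is a minimal sketch rather than Headwall's processing chain, and the reflectance values are invented for the example.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index from red and near-infrared
    reflectance; healthy vegetation pushes NDVI toward 1."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (nir - red) / (nir + red + 1e-12)  # tiny epsilon avoids 0/0

# Reflectance averaged over bands near 670 nm (red) and 800 nm (NIR)
print(ndvi(0.08, 0.45))  # vigorous canopy -> ~0.70
print(ndvi(0.20, 0.28))  # stressed canopy -> ~0.17
```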

[Image: DJI Matrice 600 multi-rotor UAV]

Since we aren't bumblebees or snakes, we need tools to see into the nether regions of the spectral range. Hyperspectral and multispectral sensors do the work our eyes can't, and they do it very well. They collect, in the case of hyperspectral, a full spectrum of image data for every pixel within the field of view. Dozens of vegetative indices exist, each using spectral data to answer questions: Are there diseases on my crops I cannot see? Is my soil nutrient-rich? Are there invasive species I need to worry about? In the end, scientists concern themselves with finding answers to these and other questions rather than poring over complicated hyperspectral data cubes. In layman's terms, you go to Home Depot not to buy a drill; you go there to buy a hole.

Hyperspectral and multispectral differ with respect to the amount of image data being collected. Hyperspectral is to multispectral what hundreds of bands are to a handful. With multispectral, you also may have gaps between the rather wide bands, and what you want to detect with the sensor might not register. But hyperspectral represents hundreds of narrow and contiguous spectral bands, so if a certain spectral signature is there, you'll see it. There are places for both multispectral and hyperspectral; if you know the spectral signature of the vegetative indices of interest and you're sure the multispectral sensor can capture it, you're all set. But much more common is the case where scientists do not know exactly where along the electromagnetic spectrum a key VI exists. Is it somewhere between 400 and 1000 nanometers (nm), which we call the visible-near-infrared (VNIR) range? Or is it further up, between 900 and 2500nm (the shortwave-infrared, or SWIR, range)? Indeed, missions may change over the course of the instrument's life, which means that scientists would opt for a combined VNIR-SWIR sensor capturing image data from 400 nm all the way up to 2500 nm.

Hyperspectral and multispectral imaging sensors are often 'line-scan' instruments, basically meaning they capture image data a slice at a time. The assembly of all these slices (or frames) is a hyperspectral data cube, which can be several gigabytes in size. Post-processing software is very good at unscrambling this complex cube of data into meaningful answers, but just as important is aircraft stability. Since UAVs are quickly becoming the 'go-to' platform for crop scientists and others, making sure the craft is stable in the air is fundamental to making sure the data can be orthorectified rather than becoming a casualty of a wobbling UAV. Fortunately, today's stabilized gimbals are outstanding at keeping the sensor in its desired position no matter what the craft does.
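To get a feel for why these cubes reach gigabytes, here is a back-of-the-envelope size estimate; the pixel and band counts are illustrative, not the specifications of a particular Headwall model.

```python
def cube_size_gb(spatial_pixels, bands, frames, bytes_per_sample=2):
    """Rough size of a raw hyperspectral cube (frames x spatial x bands),
    assuming each 12- or 16-bit sample is stored in 2 bytes."""
    return spatial_pixels * bands * frames * bytes_per_sample / 1e9

# e.g. 640 spatial pixels, 270 bands, a 3-minute line at 100 frames/s
print(cube_size_gb(640, 270, frames=3 * 60 * 100))  # ~6.2 GB
```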

Obviously, a UAV-based remote sensing system is a function of optics, electronics, and aerodynamics. Integration is an often-overlooked task, because many users assume that they can buy a UAV and a sensor and simply bolt the two together. Unfortunately, experience shows that such a piecemeal, à-la-carte endeavor is likely to fail. Battery life comes into play, balance rears its head, and understanding the relationship between frame rate and ground speed can flummox anyone. Fortunately, though, companies like Headwall Photonics exist to manage this integration process. They understand a thing or two because they've seen a thing or two. They can recommend the right kind of UAV, take size/weight/power (SWaP) into consideration, integrate spectral sensors with other instruments such as LiDAR, and deliver turnkey, flight-ready packages that even bumblebees and snakes would have to admire.
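On the frame-rate/ground-speed point, the relationship is simple once written down: consecutive line-scan frames should abut on the ground, so the required frame rate is ground speed divided by the along-track pixel footprint. A hedged sketch with illustrative numbers (the 0.9 mrad IFOV is an assumption, not a quoted specification):

```python
def required_frame_rate(ground_speed_m_s, altitude_m, ifov_mrad):
    """Frame rate needed so consecutive line-scan frames abut on the ground:
    ground speed divided by the along-track pixel footprint."""
    footprint_m = altitude_m * ifov_mrad / 1000.0  # small-angle approximation
    return ground_speed_m_s / footprint_m

# e.g. 5 m/s ground speed at 100 m altitude with a 0.9 mrad instantaneous FOV
print(round(required_frame_rate(5.0, 100.0, 0.9)))  # ~56 frames per second
```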

Tags: hyperspectral imaging, Airborne, Remote Sensing, Nano-Hyperspec

Data Fusion: A New Capability for the Remote Sensing Community

Posted by Christopher Van Veen on Tue, Mar 01, 2016

We’re seeing a tremendous increase in the number of airborne deployments for our hyperspectral imaging sensors. To a large degree, the trend toward smaller and more affordable UAVs is giving the remote sensing community the flexibility to undertake more missions and capture meaningful environmental data. From wine-grape vineyards in northern California to coffee-bean plantations in South America, the precision agriculture community is embracing packaged ‘UAS’ offerings that pair a UAV with the payload it needs to carry.

[Image: HyperCore data-fusion hub]

Collecting meaningful, actionable data for a precision agriculture scientist can mean the difference between a healthy harvest and a disastrous one. Depending on the wavelength, the sensors will spot indices indicative of diseases, irrigation deficits, crop stress, and more. An affordable UAV thus takes the place of much more expensive manned aircraft flights. Financial savings aside, the system can be hand-launched, retrieved, and deployed virtually anywhere and anytime (in adherence with all local aviation rules and regulations).

One trend we’re seeing at Headwall is the integration of multiple sensors, each producing its own stream of data. For example, the payload might comprise a VNIR (400-1000nm) sensor along with a SWIR (1000-2500nm) instrument, but it might also include LiDAR and, typically, a GPS/IMU. A Fiber-Optic Downwelling Irradiance Sensor (FODIS) is also often used to measure and collect data on changes in solar illumination.

Obviously, payload restrictions determine what the craft can lift and for how long. There is a balance to strike: an affordable UAV that is small and light may not be able to carry all the instruments a remote sensing mission demands. Optimizing Size, Weight & Power (SWaP) is the guiding principle for missions involving UAVs. There are many fixed-wing and multi-rotor UAVs on the market, all of which specify their payload limits and flight durations.
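A simple way to keep SWaP honest is to tally the payload before anything flies. The figures below are placeholders, not specifications for any particular Headwall or UAV product.

```python
# Illustrative component masses in grams -- placeholders only.
payload_g = {
    "VNIR sensor": 680,
    "SWIR sensor": 1200,
    "GPS/IMU": 150,
    "LiDAR": 900,
    "data hub + cabling": 700,
}
payload_limit_g = 6000  # what the airframe can lift beyond its own batteries

total = sum(payload_g.values())
print(f"payload {total} g of {payload_limit_g} g limit "
      f"({payload_limit_g - total} g of margin)")
```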

The goal of any remote sensing activity is to see the unseen, and then make sense of the data during post-processing. Because this data leads to important agricultural decisions, it’s crucial to synthesize the data from each instrument. The term for this is data fusion, and Headwall has just unveiled a new product called HyperCore™ that handles this important task. As the UAV flies its mission, data streams from the hyperspectral sensors, GPS/IMU, LiDAR, and other instruments are all collected on HyperCore’s 500GB drive for easy download (via Gig-E) later. HyperCore provides the most commonly used connections: two Gig-E ports, a CameraLink port, power, and two I/O ports.
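Under the hood, data fusion starts with putting every instrument on a common timebase. The snippet below is a minimal, generic sketch of tagging sensor frames with the nearest GPS/IMU record by timestamp; it is not HyperCore's actual implementation, and the timestamps are invented.

```python
import bisect

def nearest_record(timestamps, t):
    """Index of the navigation record closest in time to t; assumes the
    timestamp list is sorted, as a GPS/IMU log would be."""
    i = bisect.bisect_left(timestamps, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    return min(candidates, key=lambda j: abs(timestamps[j] - t))

imu_times = [0.00, 0.01, 0.02, 0.03, 0.04]   # 100 Hz navigation log (seconds)
frame_times = [0.004, 0.014, 0.031]          # hyperspectral frame timestamps
print([nearest_record(imu_times, t) for t in frame_times])  # [0, 1, 3]
```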

Tags: Airborne, Remote Sensing, UAS, UAV

Headwall Delivers Micro-Hyperspec® Sensors to Columbia University

Posted by Christopher Van Veen on Thu, Oct 09, 2014

High-performance imaging sensors on small, commercial UAS will assess ocean and sea ice variability in Arctic zones

FITCHBURG, MA - OCTOBER 9, 2014: Headwall Photonics has delivered two high-performance hyperspectral imaging sensors to Columbia University as part of its Air-Sea-Ice Physics and Biogeochemistry Experiment (ASIPBEX). ASIPBEX is part of a larger international collaborative investigation of Climate Cryosphere Interaction with colleagues from Spain, Germany, and Norway. This crucial remote-sensing project will use a high-endurance unmanned aircraft system (UAS) to investigate climatological changes present in the Arctic Ocean around Svalbard, Norway. The instrument payload comprises two Micro-Hyperspec sensors; one will cover the Visible-Near-Infrared (VNIR) range of 400-1000nm while the other will cover the Near-Infrared (NIR) range of 900-1700nm. Together, the sensors will be crucial in detecting indicators of sea ice physics, solar warming, and global carbon cycles.

 

[Image: UAS and Micro-Hyperspec]

"We chose the Headwall sensors for several reasons," stated Christopher Zappa, a Lamont Research Professor at Columbia's Lamont-Doherty Earth Observatory. "The very high resolution allows us to collect and process vast amounts of spectral and spatial data upon which our research and analysis depend." The wide field of view of the Headwall sensor combined with aberration-corrected optics also contributes to overall flight-path efficiency. The UAS allows scientists to measure in places that typically are impossible to get to using ships or manned aircraft. This opens up the possibility for transformative understanding of the climate system. "Since we're using a UAS, we depend on 'seeing' as much of the ocean surface as possible, minimizing any aberrations or unwanted artifacts along the edges of the field of view," noted Prof. Zappa. The combination of Micro-Hyperspec and Headwall's advanced Hyperspec III airborne software allows for the successful collection, classification, and interpretation of the spectral data collected during each flight.

 

This particular deployment for the ASIPBEX project is fundamental to Headwall's strategy of advancing the science of remote sensing aboard small, commercial unmanned aircraft systems. "Hyperspectral represents a crucial payload for any manned or unmanned deployment," noted Headwall CEO David Bannon. "But especially notable is that the UAS has become a 'go-to' platform. This means not only smaller and lighter sensors, but also integrated solutions that factor in everything from LiDAR and data management to post-processing tasks such as ortho-rectification that our software can handle." Because the Micro-Hyperspec sensor uses high-efficiency diffraction gratings in a concentric optical design, imaging performance and signal-to-noise are both maximized. The patented optical design provides a package that is rugged and robust for airborne use in harsh environments such as the Arctic Ocean.

 

The Observatory for Air-Sea Interaction Studies (OASIS) 

Led by Professor Christopher Zappa, the Observatory for Air-Sea Interaction Studies (OASIS) conducts research in a variety of fields focused on the oceanic and atmospheric boundary layers. These include wave dynamics and wave breaking, air-sea CO2 gas exchange, non-satellite remote sensing, and boundary-layer processes. Affiliated with the Lamont-Doherty Earth Observatory (LDEO) and Columbia University, OASIS is involved in joint projects with the Polar Geophysics Group of LDEO, Yale University, the University of Heidelberg, the University of Connecticut, and the University of New South Wales, and has participated in large multi-institution projects such as CBLAST-Low, GasEx, VOCALS, RaDyO, and DYNAMO.

The group develops and deploys instruments including infrared, multispectral, and polarimetric cameras on fixed and mobile platforms such as ships, aircraft, and buoys. Study areas range from laboratory wind-wave tanks and Biosphere 2, to local rivers and estuaries, to shelf seas and polynyas, to the open ocean from the poles to the equator.


For information contact:

Professor Christopher J. Zappa, Lamont Research Professor 

Lamont-Doherty Earth Observatory 

[email protected]

Tags: hyperspectral imaging, Airborne, Remote Sensing, Micro Hyperspec, UAS

Smaller, Lighter, Better: Hyperspectral on UAVs

Posted by Christopher Van Veen on Fri, Aug 08, 2014

At Headwall we've been busy listening to the market. When it comes to airborne remote sensing, the market is telling us that it favors UAVs (unmanned aerial vehicles) of all kinds: fixed-wing, multi-rotor, and so on. There's no end to the number of companies producing UAVs globally. Because many UAVs produced today are very small and affordable, they are 'within reach' of those with even modest means. Universities represent one key market where the use of UAVs is rapidly increasing. Full of scientists and research departments, universities around the globe see these small and light UAVs as a perfect platform from which to launch their exploratory studies. They are affordable, easy to assemble and transport, and (especially with multi-rotor models) can take off and land within a very small footprint.

[Image: UAV with Nano-Hyperspec]

But alongside all this enthusiasm for UAVs, there are many who frown upon these airborne vehicles and see them as a nuisance. Indeed, they can be a nuisance when used for trivial pursuits. In densely populated areas they certainly can be more than an annoyance...they can be dangerous. But largely, the work we are seeing our customers undertake with hyperspectral imagers attached to UAVs is very valuable work indeed. And it takes place far from the hustle and bustle of any urban landscape. For example, precision agriculture benefits because key indices of plant health and physiology are more readily seen from above than from below. Certain disease conditions are ‘visible’ using hyperspectral imaging, especially at the high spectral and spatial resolution found on all Headwall sensors. Other research pursuits include environmental analysis, geology, pollution analysis, and many more. These are very good and valuable scientific efforts, made more so by the UAVs that enable these precision instruments to 'fly.' The marriage between hyperspectral and UAV seems to be a perfect one, especially when you consider how much ground can be covered with one of these flying wizards, and when you realize that hyperspectral imaging fundamentally requires motion. In other words, hyperspectral was meant for airborne deployment. Where a Jeep can’t go, a UAV can. And more ground can be covered with a UAV, meaning more efficient data collection over rugged and inaccessible landscapes.

[Image: Nano-Hyperspec]

As UAVs get smaller and lighter, users run headlong into the issue of payload: UAVs are limited with respect to what they can lift. Whatever else a UAV is asked to carry, it needs to lift batteries. Then comes the instrumentation. Headwall’s Nano-Hyperspec was just introduced for the VNIR (400-1000nm) spectral range. Most (but not all) of the things a research scientist might wish to ‘see’ are visible in this spectral range. But we did a couple of things with Nano-Hyperspec that help with the payload issue. First, the size and weight are well below previous sensor offerings: including the lens, it measures a scant 3” x 3” x 4.72” (76.2mm x 76.2mm x 119.2mm) and weighs less than 1.5 lb. (0.68kg). Best of all, this includes on-board data storage of 480GB. That’s about 130 minutes of recording at 100fps.
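As a rough sanity check on that figure, here is the arithmetic; this is a minimal sketch, and the ~0.6 MB frame size is inferred from the numbers above rather than quoted from a Nano-Hyperspec datasheet.

```python
def record_minutes(storage_gb, frame_mb, fps):
    """Minutes of recording before on-board storage fills up."""
    return storage_gb * 1000 / (frame_mb * fps) / 60

# ~0.6 MB per hyperspectral frame at 100 frames/s against 480 GB of storage
print(round(record_minutes(480, 0.615, 100)))  # ~130 minutes
```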

Aside from making Nano-Hyperspec smaller and lighter than other hyperspectral sensors, a key differentiator comes from embedding the data storage within the enclosure while providing multiple attach points for the GPS/IMU. Another key attribute is the inclusion of the full airborne version of Headwall’s Hyperspec III software, which includes a polygon flight tool for sensor operation and a real-time Ethernet Waterfall display. While the work to shrink the size and weight of Nano-Hyperspec is valuable by itself, it also gives the user more room and available payload to carry other instrumentation. Hyperspectral combined with LiDAR and thermal imaging is an extremely valuable package, made possible by the overall size/weight reduction of Nano-Hyperspec and the embedding of the data storage/management capabilities (which previously required a separate enclosure).

Hyperspec III software gives users full control over data acquisition, sensor operation, and datacube creation in ENVI-compatible format. Hyperspec III also works in full conjunction with the GPS that can be paired with the sensor as an optional Airborne Package. With this package, customers can take advantage of real-time computation of inertially enhanced position/velocity, -161 dBm tracking sensitivity, accurate 360-degree 3D orientation output of attitude and heading, correlation of image data to GPS data, and much more. During post-processing, the Airborne Package also effortlessly handles radiometric calibration and conversion as well as orthorectification.

Tags: Airborne, Remote Sensing, UAV, agriculture, precision agriculture

Hyperspectral Takes Wing Over Ontario!

Posted by Christopher Van Veen on Thu, May 01, 2014

[Image: UAS test flight in Ontario]

Under cloudless skies in Ontario recently, Headwall achieved a very notable milestone: we became the first to fly both hyperspectral and LiDAR aboard a small, fully integrated handheld UAS. The test flights verified not only the system's reliable airworthiness but also its ability to collect valuable hyperspectral and LiDAR data in real time.

Integration is key, because all of this specialized data-collecting instrumentation needs to fit the payload parameters with respect to size and weight. With UAS systems shrinking in size and weight, payloads need to follow suit. As prime contractor for this complete airborne system, Headwall is able to get end-users up and running quicker than ever. Time to deployment is reduced by months thanks to the work Headwall is doing to engineer optimized solutions that meet specific remote-sensing needs.

“The applications for this type of integrated airborne system are numerous,” said Headwall CEO David Bannon. “Precision agriculture is a key one we’re seeing on a global scale, but geology, pipeline inspection, environmental research, and pollution analysis are others.” Today’s UAS is smaller, lighter, and more affordable than ever, which makes it a perfect platform from which to carry precise imaging instruments such as hyperspectral and LiDAR. “We’ve always been a pioneer in the area of small hyperspectral sensors for just these kinds of deployments,” noted Bannon. “Our strength comes from understanding what our users want to do and then engineering a complete airborne solution that meets that need.”

Chris Van Veen, marketing manager at Headwall, was on site to record and document the test flights. “A fully integrated package like this represents a new frontier for remote-sensing scientists who now have an airborne research platform that goes wherever they do,” says Chris. “Watching this fly and collect data in Canada was a thrill because it was visible testimony to all our integration work.”

The entire payload aboard this particular UAS is less than ten pounds, which includes hyperspectral, GPS/IMU, LiDAR, and computing hardware. Besides making sure these elements are small and light enough, the challenge of integrating everything with an eye toward battery lifetime is also Headwall’s to manage. “We know our remote-sensing users have very important work to do, and they need sufficient power not only to fly but also to operate the instruments,” said Bannon. One way to meet this challenge head-on is to make sure the hyperspectral sensor provides a very wide field of view with precise imagery from one edge to the other. “If you can assure outstanding image-collection across a wide field of view, and then provide orthorectification of that data, you’re covering more ground for each flight swath.”
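Swath width feeds directly into mission planning: the wider the swath, the fewer parallel flight lines are needed to cover a site. A rough sketch follows; the site width, swath, and sidelap values are made up for illustration.

```python
import math

def flight_lines(area_width_m, swath_m, sidelap=0.3):
    """Number of parallel flight lines needed to cover an area of the given
    width when adjacent swaths overlap by the stated fraction."""
    effective_m = swath_m * (1 - sidelap)
    return math.ceil(area_width_m / effective_m)

# e.g. a 500 m-wide site, a 120 m swath, 30% sidelap
print(flight_lines(500, 120))  # 6 flight lines
```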

Fundamental to accomplishing this is Headwall’s approach to optics, which is both simple and elegant. “Our diffractive optics approach uses no moving parts, which, in an airborne application, means robustness and reliability,” said Bannon. Inside each Micro-Hyperspec sensor is a precise and small holographic diffraction grating that manages incoming light with exceptional fidelity. These sensors are ‘tuned’ for the spectral range of interest to the user. “Depending on what the user wants to ‘see,’ he may need a VNIR sensor that operates from 380-1000 nanometers,” said Bannon. The spectral signature of a certain disease condition on a crop tree will determine the spectral range of the sensor, for example. Headwall has also introduced a wideband VNIR-SWIR sensor package that covers from 400-2500 nanometers. This co-registered hyperspectral instrument will be very popular with users who need broad coverage but need a small, light, and affordable instrument to do it with.

The following video will give you a peek into how flight testing went in Ontario.

Tags: hyperspectral, Airborne, Remote Sensing, UAS, UAV, agriculture

UAVs and Hyperspectral Imaging Unite

Posted by Christopher Van Veen on Tue, Mar 25, 2014

One of the things we’re seeing at Headwall is the proliferation of airborne applications. Multispectral suffers in comparison with hyperspectral (a handful of bands versus hundreds), which is why hyperspectral is winning the day.

[Image: UAV choices]

One reason is instrument affordability. Multi-million-dollar hyperspectral sensor programs might have flown (literally and figuratively) in the military world, but not in precision agriculture or with universities. Budgets are smaller, and that money has to be spread among not only the sensor but the UAV and everything in between. This is where small, entrepreneurial companies like Headwall shine, because everything in between can mean LiDAR, GPS/IMU technology, application software, data processing, and so much more. We understand hyperspectral imaging better than anyone, and our focus has always been to better that technology while driving costs lower. This is the essence of commercial-off-the-shelf (COTS), where highly specialized military instrumentation finds a home all across industry and academia. With respect to Headwall, COTS implementation means smaller, lighter and more affordable sensors that are easier to use yet just as optically precise as their multimillion-dollar military counterparts.

Second, you cannot go a day without seeing stories about UAVs. Fixed-wing designs like those from AGX and PrecisionHawk are crowding the skies along with multi-rotor helicopters like Infinite JIB and AIBOTIX. These are much more than hobbyist playthings and are perfect for scientific research duties. They have excellent range and payload-carrying characteristics, and they are stable aloft. From mineral exploration and agriculture to petroleum and pollution control, UAVs are everywhere it seems. And everyone takes notice when household names like Facebook, Google, and Amazon decide that the UAV is going to be instrumental to their future success. Much of this might sound fanciful and far-off, but it is happening now. Court challenges are being won, and while care needs to be taken on how regulations are drafted and enforced, no one doubts that the UAV is not only here to stay but will become commonplace.

Obviously, UAVs simply take up airspace unless they are doing good work. And largely, we seem to hear about bad things happening when mention of UAVs (and drones) is made. But stop and consider for a moment how a famine-stricken area can be made crop-fertile thanks to hyperspectral data that a UAV-mounted sensor can collect. A scientist will know about disease conditions with enough time to prevent damage by skimming the treetops and looking for anomalies that become ‘visible’ through hyperspectral imaging. A farmer will know where to plant and harvest…and where not to. Crop stress will be seen long before it becomes a worry, and the amount of wholesome and nourishing food planted in areas once thought impossible will blossom. In short, small and light UAVs are affordable for the people who need to use them. They can be flown in areas that vehicles and humans cannot yet reach, providing a window of research never available to scientists before.

As we see the proliferation of UAVs capable of carrying sensor payloads, it is important to understand how everything goes together. Here, Headwall is taking a leading role. Many mistakenly believe that slapping a sensor onto an octo-copter is all they need to do. But making sure everything works the way it should aboard a flying, unmanned vehicle is another challenge altogether. How much ground do you need to cover, and do you have enough battery power to do it? How much hyperspectral data do you need to collect, and do you have the computing and storage horsepower to make that happen? What are you looking for, and what spectral ranges are those things in? How do you ortho-rectify the data during post-processing? And how do you use the science of ground-truth as it relates to airborne hyperspectral imaging? This last consideration is hugely important, because the collaboration of airborne hyperspectral and ground-truth delivers the best possible accumulation of data. Headwall and ASD have even authored a 12-page whitepaper on the relationship between airborne hyperspectral data and ground-truth techniques.

Tags: hyperspectral imaging, Airborne, Remote Sensing, UAV, precision agriculture

Headwall's Field-of-View Calculator

Posted by Christopher Van Veen on Mon, Mar 17, 2014

When it comes to hyperspectral imaging, it isn’t always about the hardware. Before users even get to the stage of specifying a sensor instrument, they need to ask a few questions:

  • What do I want to look at?
  • How am I deploying the sensor?
  • What is the spectral range of what I’m looking at?
  • How far from the object will I be?

The answers to these questions will lead to an informed decision about the kind of sensor that’s best, the kind of lens it will need, and how small and light the sensor needs to be. At Headwall, we’re helping customers sort through these questions and considerations every day. We offer on-line tools that make instrument specification easy. With the answers to a few simple questions, the overall application-specific design of a hyperspectral instrument is well within reach. This means quicker time-to-deploy for customers who have challenging scientific questions that need answers.

One of Headwall’s newest tools is the Field-of-View (FOV) calculator. This tool collects a few important user-defined parameters to arrive at several what-if scenarios. The first parameter is the distance from lens to object. In an airborne application, the distance would likely be measured in meters; for lab-based or in-line deployment, it might only be centimeters. The second parameter is the wavelength range, which can span anywhere from UV-VIS (380-825nm) up to SWIR (950-2500nm). Knowing the spectral signature of the item of interest will point you in the correct direction.

[Image: Field-of-View calculator inputs]

The calculator will take this information and combine it with choice of sensor and lens to arrive at useful data for the customer. In this case, we see that for the parameters and options chosen we are given the number of spatial and spectral channels (1004 and 335 respectively). We’re also given the linear and angular FOV, the instantaneous FOV, and the spectral resolution. In an airborne application, the linear FOV can be thought of as the flight swath. The wider the better, because the aircraft or UAV will be able to collect full hyperspectral information with fewer passes over the ground.
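The geometry behind these numbers is straightforward. Below is a minimal sketch of the swath (linear FOV) and across-track ground sample distance for a nadir-pointing line-scan sensor; the 50-degree angular FOV used here is an assumption for illustration, not a quoted Headwall specification.

```python
import math

def swath_and_gsd(altitude_m, fov_deg, spatial_pixels):
    """Linear field of view (swath) on the ground and the across-track
    ground sample distance for a nadir-pointing line-scan sensor."""
    swath_m = 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)
    return swath_m, swath_m / spatial_pixels

# e.g. 100 m altitude, a 50-degree angular FOV, 1004 spatial channels
swath, gsd = swath_and_gsd(100, 50, 1004)
print(f"swath {swath:.1f} m, GSD {gsd * 100:.1f} cm")  # ~93.3 m, ~9.3 cm
```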

[Image: Field-of-View calculator output]

Spectral libraries are common starting points for defining where to look along the spectral range. The spectral signature of everything from plants and crops to minerals and petroleum is known or catalogued. While everything has its own signature, the real strength of hyperspectral imaging is to discriminate and classify. So while the sensor can actually ‘see’ everything, it is tuned to look for things that may resonate at 900 nm or 1900 nm, for example. A disease condition on a fruit tree may be impossible to detect by any visible means, but it will resonate quite clearly when seen with a hyperspectral sensor.

Customers come to Headwall regularly with certain ‘needs.’ A crop scientist may want to analyze the soil from an airborne UAV. Another may want to adopt hyperspectral imaging along a high-speed food processing line to see and remove foreign matter. A third may be a museum preservationist interested in understanding the artwork and artifacts under their care. But in all cases, the first question is: What do you want to see?

 

Tags: hyperspectral imaging, Headwall Photonics, Airborne, Sensors

Hyperspectral Sensors for UAV Applications

Posted by Christopher Van Veen on Wed, Feb 19, 2014

The scientific research community is beginning to understand and embrace hyperspectral imaging as a useful tool for a few primary reasons. First, sensors are more affordable than ever. Originally conceived as multi-million-dollar ISR platforms for defense applications, hyperspectral imagers have been successfully ‘commercialized’ over the past few years. Scientists who previously relied on RGB or multispectral technology can now acquire hyperspectral sensors at affordable price points.

Hyperspectral sensors of the ‘pushbroom’ type produced by Headwall require motion to occur. That is, either the sensor flies above the field of view, or the field of view moves beneath the sensor. For UAV applications, Headwall’s small and lightweight Micro-Hyperspec is the platform of choice. Available in the VNIR (380-1000nm), NIR (900-1700nm), and SWIR (950-2500nm) spectral ranges, the sensor is truly ‘SWaP-friendly.’

Spectral range is often where the decision-making starts. The chemical fingerprint—or spectral signature—of anything within the field of view will lead the user in one direction or another. For example, a certain disease condition on a tree canopy may become ‘visible’ within the SWIR spectral range (950-2500nm). Similarly, a certain mineral deposit may become ‘visible’ in the VNIR range (380-1000nm). One approach to ensuring the spectral ‘fidelity’ of images collected by the sensor makes use of ‘diffractive optics’ comprising aberration-corrected holographic gratings. This ‘Aberration-corrected concentric’ design is shown below.

[Image: aberration-corrected concentric imager design]

There are several advantages to this ‘reflective’ approach. First, the design is simple, temperature insensitive, and uses no moving parts. This assures robustness and reliability in airborne situations. Second, diffraction gratings can be made very small so that the instruments themselves can be small and light; in other words, capable of fitting the new class of lightweight, hand-launched UAVs. Third, the design optimizes technical characteristics that are most important: low distortion for high spatial and spectral resolution; high throughput for high signal-to-noise; and a tall slit for a wide field-of-view. Because the design is an all-reflective one, chromatic dispersion is eliminated and excellent focus is assured across the entire spectral range.

Many within the environmental research community and across ‘precision agriculture’ prefer to use UAVs as their primary airborne platform. They are more affordable than fixed-wing aircraft and easy to launch. But as UAVs get smaller and lighter, so must the payloads they carry. And integrating the sensor into the airframe along with other necessities such as LiDAR, power management/data collection hardware, and cabling can be a daunting task. Orthorectification of the collected data is another key requirement: it is the means by which the hyperspectral data cube is ‘corrected’ for airborne anomalies and turned into useful information. In other words, the collected hyperspectral data needs to be ‘true’ to what’s actually within the field of view.

[Image: Micro-Hyperspec sensor]

Acquiring a UAV and a hyperspectral sensor won’t assure compatible performance, and a high level of ‘integration work’ is needed. The UAV community and the hyperspectral sensor community are both challenged with pulling everything together. Recognizing this, Headwall Photonics is taking an industry-leading position as a supplier of fully integrated airborne solutions comprising the UAV, the sensor, the power and data management solution, cabling, and application software. The result is that users are flying sooner and collecting better hyperspectral data than ever before.

The type of UAV is often one of the first decisions a scientist will need to make. Fixed-wing and multi-rotor are the two general categories, with numerous styles and designs within each. In-flight stability and flight-time duration are both paramount concerns, and this is where payload restrictions will often point toward one or the other. Multi-rotor UAVs launch and land vertically, so this type will be favored in situations where space is tight. Conversely, a fixed-wing UAV requires suitable space to launch and land but can provide longer flight duration and carry a heavier payload. The wide field-of-view characteristic of the concentric imager allows a UAV to ‘see’ more ground along its flight path.

[Image: integrated airborne package]

Two other key areas managed through Headwall’s integrative process are data management and application software. While a separate subsystem is used to control the sensor operation and store the hyperspectral data, the direction is clearly toward on-board integration of these capabilities. Flash storage and solid-state drives will soon make it possible for the sensor to ‘contain’ all the related functionality that now needs to be contained in a separate module. This will clearly lighten the overall payload, reduce battery consumption, and boost airborne flight time.

Headwall’s Hyperspec III software represents a complete, modularized approach to the management of hyperspectral data. Orthorectification is one such module within the software suite; it removes the unwanted effects of airborne behavior. The resultant orthorectified images have a constant scale wherein features are represented in their 'true' positions. This allows for the accurate, direct measurement of distances, angles, and areas. Other aspects of the software suite can be used to control GPS/IMU devices, control multiple sensors simultaneously, and save polygons (a Google-map-enabled tool that allows the user to define geographic coordinates).
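Because an orthorectified image has a constant ground sample distance, such measurements reduce to pixel arithmetic. A minimal sketch is below; the GSD and pixel coordinates are made up for illustration.

```python
import math

def ground_distance_m(pixel_a, pixel_b, gsd_m):
    """Distance between two features in an orthorectified image, where one
    ground sample distance (GSD) applies uniformly to every pixel."""
    dx = (pixel_b[0] - pixel_a[0]) * gsd_m
    dy = (pixel_b[1] - pixel_a[1]) * gsd_m
    return math.hypot(dx, dy)

# Two features 300 columns and 400 rows apart at a 10 cm GSD
print(ground_distance_m((100, 100), (400, 500), 0.10))  # 50.0 m
```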

Tags: hyperspectral imaging, hyperspectral, Airborne, Remote Sensing, Micro Hyperspec, agriculture, diffraction gratings, precision agriculture

Headwall Remote Sensing Capabilities Seen “Down Under”

Posted by David Bannon on Wed, Jul 31, 2013

[Image: Melbourne, Australia]

This past week, Headwall's remote sensing team wrapped up a productive visit Down Under at the International Geoscience and Remote Sensing Symposium (IGARSS) in Melbourne, Australia. The conference, organized by the IEEE, comprises a 'Who's Who' of the global remote sensing community. But curiously absent were representatives from the United States, probably reflecting the topic du jour: sequestration. Imagine holding a geo-spatial and remote sensing conference that no one from NASA was able to attend.

From an international perspective, we observed tremendous interest from customers looking to gain spectral capability for their manned aircraft, and also surprising interest from organizations looking to buy "all-inclusive" UAV configurations that include the Micro-Hyperspec imaging spectrometer, a GPS/INS unit, a lightweight embedded processor, and a suite of application software. This complete airborne package was a big hit at IGARSS because, while users have a good grasp of the benefits of airborne hyperspectral, they need help making it work in a particular application. Two very nice UAVs on display at IGARSS created a lot of buzz in the Headwall booth. Although Headwall doesn't make the UAV platform, we make them do some pretty amazing things within the realm of hyperspectral remote sensing. That message came through loud and clear, as our stand at IGARSS was phenomenally busy from the start right through to the end.

[Image: Headwall booth at IGARSS 2013]

A bit further up in altitude were visitors interested in hyperspectral remote sensing from space. A major point of interest throughout the conference was a demonstrated need for cost-effective, space-qualified hyperspectral sensor payloads. With most of the world's planned remote sensing missions being delayed for budget reasons, VNIR (380-1000nm) and SWIR (900-2500nm) space-qualified imagers are hot commodities. This is an area Headwall has developed over the last five years with its own space-qualified sensor payloads. There was also strong focus from attendees on how satellite collaboration could be established among the world's most notable remote sensing programs. Japan's ALOS-3 (2016 launch?), European ENMAP (2017 launch?), and the NASA HYSPIRI mission (2023 launch?) represent three of several.

[Image: Great Ocean Road, South Coast of Australia]

Even with all the activity at IGARSS, Headwall's remote sensing team, led by Principal Engineer Kevin Didona, also took some hyperspectral scans of rock-wall formations at some very scenic places along the Great Ocean Road on the South Coast of Australia.

Headwall has developed extensive experience in the application of hyperspectral sensors specifically designed for UAVs, so please drop us a line or give us a call if we can provide information to meet the objectives of your remote sensing research.

Email us at [email protected]

Visit us at www.HeadwallPhotonics.com

Or call us at Tel: +1 978 353 4003


Tags: hyperspectral imaging, hyperspectral, Headwall Photonics, Airborne, Remote Sensing, Sensors, Micro Hyperspec, UAS, SWIR, Sensing, VNIR, Satellites, UAV