Headwall Photonics Blog

Hyperspec SWIR Gets UAV Wings!

Posted by Christopher Van Veen on Thu, Sep 14, 2017

Remote sensing with hyperspectral sensors is a combination of two key elements: an imaging spectrometer and a fast data-processing system to acquire and analyze spectral and spatial data. For remote sensing missions such as crop-disease or invasive-species detection, the spectral range of most interest is the visible-near-infrared (VNIR), from 400-1000nm.

Geologic exploration requires a different spectral region, as the signatures of interest are evident in the shortwave-infrared (SWIR) range of 900-2500nm. Each type of mineral has its own unique spectral signature reflecting its chemical composition. Because different minerals reflect light only slightly differently, it's important to have many contiguous spectral bands.

Given the sensor requirements for generating SWIR hyperspectral data, careful attention to size, weight, and power is essential for deployment on UAVs. Headwall has worked strategically to 'lightweight' its Hyperspec sensors for field deployment. One of the main design aspects is Headwall's 'all-reflective' approach, which allows for a very small instrument size and form factor. The Hyperspec sensors are not only smaller and lighter but also more robust in airborne situations, particularly for UAV deployment.

[Image: DJI UAV carrying a Micro-Hyperspec SWIR sensor]

Headwall's Micro-Hyperspec SWIR is being successfully deployed on multi-rotor UAVs with excellent stability and imaging results. What's more, the payload also includes a high-performance GPS/IMU and Headwall's HyperCore data-fusion hub, which synthesizes the data streams coming from complementary instruments. For example, LiDAR can also be integrated into the payload and handled by HyperCore.

There are several aspects of Headwall’s spectral imaging sensor design that are advantageous for geological research. “When you’re using a UAV, flight time is crucial,” said Peter Clemens, Director of Engineering at Headwall. “Efficiently capturing wide swaths of valuable image data on each flight is therefore a huge advantage.” Headwall has built ortho-rectification into its Hyperspec spectral software, allowing users to generate highly accurate geospatial data. “We give our customers the technology to identify not only what they see within the scene but a precise location as to where it is on the ground,” continued Clemens. Headwall has developed solution bundles that package the hardware and software necessary for customers not only to acquire spectral data quickly but also to process it efficiently and view spectral maps.

Tags: hyperspectral imaging, Airborne, Remote Sensing, geology

The Eyes Have It...But Not Always

Posted by Christopher Van Veen on Thu, Apr 13, 2017

Humans have a marvelous ability to see and identify objects within what is called the visible range of the electromagnetic spectrum. That starts at roughly 380 nanometers and goes up to around 700 nanometers or so.

But there are things that researchers and scientists might wish to 'see' that fall below (ultraviolet) or above (infrared) this 'visible' portion of the spectrum. If you were a bumblebee, you could see into the UV range; if you were a rattlesnake, into the infrared. Obviously (and thankfully) we're neither, but to see into these other ranges we need help. And why would we care about anything our eyes cannot see? Well, to take just precision agriculture as a key example, there are vegetative indices (VIs) that depend on seeing into the infrared ranges where the spectral signatures of chlorophyll fluorescence are detectable. Chlorophyll fluorescence is predictive of crop stress and vigor, so being able to see and quantify its effects can tell crop scientists much more than their own eyes can.

DJI-MATRICE-600.jpg

Since we aren't bumblebees or snakes, we need tools to see into the nether regions of the spectral range. Hyperspectral and multispectral sensors do the work our eyes can't, and they do it very well. They collect, in the case of hyperspectral, a full spectrum of image data for every pixel within the field of view. Dozens of vegetative indices exist, with each using spectral data to discern answers to questions: Are there diseases on my crops I cannot see? Is my soil nutrient-rich? Are there invasive species I need to worry about? In the end, scientists concern themselves with finding answers to these and other questions rather than poring over complicated hyperspectral data cubes. In layman's terms, you go to Home Depot not to buy a drill; you go there to buy a hole.
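
As a concrete illustration of how a vegetative index pulls an answer out of a data cube, here is a minimal NDVI sketch in Python. All names and numbers are illustrative assumptions (a small NumPy reflectance cube with nominal VNIR band centers), not Headwall specifications:

```python
import numpy as np

# Hypothetical hyperspectral cube: (rows, cols, bands), reflectance 0-1.
# Band centers are assumed to run 400-1000 nm in even steps (VNIR range).
rows, cols, bands = 4, 4, 270
wavelengths = np.linspace(400, 1000, bands)

rng = np.random.default_rng(0)
cube = rng.uniform(0.05, 0.6, size=(rows, cols, bands))

def ndvi(cube, wavelengths, red_nm=670.0, nir_nm=800.0):
    """Compute NDVI per pixel from the bands nearest the red and NIR centers."""
    red_idx = int(np.argmin(np.abs(wavelengths - red_nm)))
    nir_idx = int(np.argmin(np.abs(wavelengths - nir_nm)))
    red = cube[:, :, red_idx]
    nir = cube[:, :, nir_idx]
    return (nir - red) / (nir + red + 1e-12)  # small epsilon avoids divide-by-zero

vi = ndvi(cube, wavelengths)
print(vi.shape)  # one NDVI value per pixel
```

The same pattern extends to any band-ratio index: pick the nearest bands, combine per pixel.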

Hyperspectral and multispectral differ with respect to the amount of image data being collected. Hyperspectral is to multispectral what hundreds of bands are to a handful. With multispectral, you also may have gaps between the rather wide bands, and what you want to detect with the sensor might not register. But hyperspectral represents hundreds of narrow and contiguous spectral bands, so if a certain spectral signature is there, you'll see it. There are places for both multispectral and hyperspectral; if you know the spectral signature of the vegetative indices of interest and you're sure the multispectral sensor can capture it, you're all set. But much more common is the case where scientists do not know exactly where along the electromagnetic spectrum a key VI exists. Is it somewhere between 400 and 1000 nanometers (nm), which we call the visible-near-infrared (VNIR) range? Or is it further up, between 900-2500nm (the shortwave-infrared, or SWIR, range)? Indeed, missions may change over the course of the instrument's life, which means that scientists would opt for a combined VNIR-SWIR sensor capturing image data from 400 nm all the way up to 2500 nm.
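
The band-gap problem is easy to see numerically. In this toy Python sketch (made-up spectra and band centers, not real sensor specifications), a narrow absorption feature registers clearly in contiguous 5nm hyperspectral bands but falls between the centers of a hypothetical four-band multispectral sensor:

```python
import numpy as np

# Contiguous hyperspectral band centers, 400-1000 nm at 5 nm spacing.
wl = np.arange(400, 1001, 5)
spectrum = np.full(wl.shape, 0.5)          # flat background reflectance
spectrum[np.abs(wl - 970) <= 10] = 0.2     # narrow absorption dip near 970 nm

# Hypothetical multispectral sensor sampling only four band centers.
multi_centers = np.array([450, 550, 650, 850])
multi_vals = np.interp(multi_centers, wl, spectrum)

print(spectrum.min())   # the hyperspectral sampling sees the dip
print(multi_vals.min()) # the four multispectral bands miss it entirely
```

With contiguous narrow bands, the dip cannot hide; with a handful of widely spaced bands, it can.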

Hyperspectral and multispectral imaging sensors are often 'line-scan' instruments, basically meaning they capture image data a slice at a time. The composite of all these slices (or frames) is a hyperspectral data cube, which can be several gigabytes in size. Post-processing software is very good at unscrambling this complex cube of data into meaningful answers, but just as important is aircraft stability. Since UAVs are quickly becoming the 'go-to' platform for crop scientists and others, making sure the craft is stable in the air is fundamental to making sure the data is orthorectified rather than a casualty of a wobbling UAV. Fortunately, stabilized gimbals are outstanding nowadays, with the ability to keep the sensor in its desired position no matter what the craft does.
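
The slice-by-slice idea can be sketched in a few lines of Python. The frame sizes below are illustrative assumptions, not the specification of any particular sensor:

```python
import numpy as np

# Each line-scan frame is (spatial_pixels, spectral_bands); stacking the
# frames along the flight direction yields the hyperspectral data cube.
spatial, bands, n_frames = 640, 270, 100   # illustrative sizes

frames = [np.zeros((spatial, bands), dtype=np.uint16) for _ in range(n_frames)]
cube = np.stack(frames, axis=0)            # (along-track, across-track, bands)
print(cube.shape)
```

At realistic frame rates and bit depths, the same arithmetic is what drives cubes into the multi-gigabyte range.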

Obviously, a UAV-based remote sensing system is a function of optics, electronics, and aerodynamics. Integration is an often-overlooked task, because many users assume that they can buy a UAV and a sensor and bolt the two together. Unfortunately, experience shows that such a piecemeal, à-la-carte endeavor is likely to fail. Battery life comes into play, balance rears its head, and understanding the relationship between frame rate and ground speed can flummox anyone. Fortunately, though, companies like Headwall Photonics exist to manage this integration process. They understand a thing or two because they've seen a thing or two. They can recommend the right kind of UAV, take size/weight/power (SWaP) into consideration, integrate spectral sensors with other instruments such as LiDAR, and deliver turnkey, flight-ready packages that even bumblebees and snakes would have to admire.

Tags: hyperspectral imaging, Airborne, Remote Sensing, Nano-Hyperspec

Nano-Hyperspec, PrecisionHawk Impress!

Posted by Christopher Van Veen on Wed, Jul 06, 2016

Steven Sexton is Technical Consultant at Aerial Imaging Services, LLC (Ephrata, WA). With broad availability of new UAVs and high-performance hyperspectral imaging sensors, Steven's business is a good one. 'Remote sensing' is the study of agriculture, climatology, geology, and infrastructure from airborne platforms. The amount and quality of image data the sensors collect is amazing, allowing scientists to make important decisions about crops, plant health, mineral deposits, and environmental trends.

Recently, Steven teamed up with PrecisionHawk (Raleigh, NC) and Headwall Photonics to put one of these 'flying laboratories' into the air. Because the combination of UAVs and specialized sensing instruments is still 'new' to many users, ease of integration and great customer support from PrecisionHawk and Headwall allowed Steven to get into the air collecting data-rich images of the ground below. PrecisionHawk took care of many of the airborne issues while Headwall addressed the hyperspectral side of the application. Together, both companies helped Aerial Imaging Services reach a very impressive level of differentiation in a still-emerging business. The myriad mechanical, electrical, optical, and aerodynamic considerations can be daunting, and Steven took to LinkedIn on June 28, 2016 to tell his story:

*********************************************************************************

I am going to shift focus to sensors today. I recently acquired a Nano-Hyperspec sensor from Headwall Photonics and PrecisionHawk. This sensor is absolutely amazing and is configured to just plug right onto the Lancaster Rev 4 and the Lancaster 5. This plug-and-play setup is how all of the sensors PrecisionHawk sells are configured, making it extremely easy to do a visual scan, land, change to a BGNIR sensor, and fly. These sensors scan at much higher resolutions than most multispectral sensors. Around here, being at a higher altitude means not running into the trees lining fields, silos, buildings, etc. I get 1.5cm per pixel at 100m (329 feet) AGL. I go lower for LiDAR and thermal, to 60m (196 feet) AGL.

Now this may seem a little high, but with the higher-resolution sensors I got from PrecisionHawk it just made sense: less worry and more time to just watch the Lancaster do its thing. It also means I don't have to make as many passes over a field as I would at lower altitudes. Now, I haven't heard what resolution some of the newer multispectral sensors that have recently come out (or those I may not know of) can achieve. If you use one of these and it gets as high a resolution, please either leave a comment and share your results or message me so I can add that information to this article. I want to give everyone a fair shake here.

[Image: Steve Sexton]

Now back to the Headwall Photonics Nano-Hyperspec® sensor. This unit is a little heavier than most of my other sensors; the LiDAR is about as heavy. The reason for the extra weight is a 500GB SSD attached to it. It also has a network-cable interface to hook to your computer or laptop. Please read the manuals that come with it; they will save you a lot of headaches when figuring out how to access the SSD on the sensor. You can find general information at the Headwall site here.

The customer service from Headwall is absolutely amazing. I decided to update the Nano driver software, missed one step, and wound up not being able to access the data. Now this was totally my fault; I kind of went in blind to do the update.

Greg Chenevert from Headwall was extremely helpful and had me try a few things. These didn't work, but Greg spent time trying to help me get things going and guaranteed that they would get it back up. He put me in contact with one of the company's programmers; I gave him remote access to my system with the Nano hooked up and running, and he had it all set up, reconfigured, and doing imaging within probably 15 minutes at the most. Now I don't know about the rest of you, but most companies don't even come close to the customer service of Headwall and PrecisionHawk. They went way above and beyond to get the sensor working so I could do my job.

The unit itself is not that big; it is the bracket and connectors for the plug-and-play that make it seem larger. On average I can swap out a sensor and battery in about 30 to 45 seconds. The battery only lasts for about 35 minutes with the heavier load, but I get some pretty amazing images with it.

Headwall has software that accompanies the sensor that is very useful and allows you to transfer the files to a local drive on your computer or laptop. You could even transfer it to a USB drive if you have one that can hold the amount of data you get. There is also an option to view the data as NDVI, and this can be done right in the field if you so desire. I usually just bring it back to the office, process it there, and add it to other data sets I have gathered on that particular job. It does make the farmer happier if you can show it in the field.

[Image: Lancaster UAV]

When I first started, I was unsure of which sensors I should purchase. I imagine several of you have or had the same issue. I determined that if I only get the sensors for agriculture, then I am going to be very poor during the winter months. I decided to add the LiDAR, thermal, and the Headwall Nano-Hyperspec sensor. This gives me the ability to do other types of work during the non-growing season. I also don't mind traveling to a location or even going to another area for several weeks at a time, so this also opened up income opportunities.

The data is only as good as your sensors. Sure, the higher-quality imagery costs a bit more, but it also means the data is going to be more precise. Combined with the DataMapper algorithms, you get a very complete package from one source.

Tags: Remote Sensing, UAV, precision agriculture, Nano-Hyperspec, PrecisionHawk

Landmine Detection Using Hyperspectral Imaging

Posted by Christopher Van Veen on Wed, Apr 20, 2016


When you see how rapidly the use of drones for scientific research has risen, you first conclude that it's all about precision agriculture and climatology. To be sure, those are trendsetting applications when it comes to using hyperspectral imaging sensors aboard UAVs. But over at the University of Bristol, two scientists are leading a team focused on 'finding a better way' to detect the presence of landmines that kill or maim thousands of people annually. With around 100 million landmines underneath the ground globally, traditional means of finding and eliminating them would take about 1,000 years and cost upwards of $30 billion according to some estimates.

Find A Better Way is a UK-based not-for-profit founded by famed footballer Sir Bobby Charlton. Despite his heroic sporting achievements, Sir Bobby is now forging a legacy outside of football through his determination to champion the cause of landmine detection and elimination. He witnessed the destruction caused by landmines on visits to Cambodia and Bosnia as a Laureus Sport for Good Ambassador. He founded Find A Better Way after recognizing that research and development held the key to making the major changes necessary to allow humanitarian teams to rid the world of the threat of landmines.

"We want to do something that very quickly delivers a step-change in capability while reducing overall human risk involved with finding and eliminating landmines," said Dr. Tom Scott of the University of Bristol. He and Dr. John Day are pairing their advanced UAV with small and lightweight hyperspectral imaging sensors from Headwall Photonics to 'see' with a specificity and resolution unheard of only a few short years ago. "These drones can be autonomously deployed to fly over a landmine area and provide high-resolution images that allow us to reconstruct the 3-D terrain with very high accuracy," said Dr. Scott. With all the landmines across the world, tactical deployment of numerous low-flying drones is going to win the day over expensive satellites or high-flying aircraft. In order to meet this objective, the package needs to be simultaneously affordable, light, and suited to its mission.

There is a vast amount of integration and testing work involved before the first meaningful flights can be flown. Recognizing this, Headwall is assuming much of this work so that users can compress time-to-deployment significantly. Because the use of drones for scientific research is still in its infancy, misconceptions abound. Acquiring a UAV and slapping a hyperspectral sensor onto it without first considering all the variables is a recipe for disaster. This holds true whether the mission is landmine detection or precision agriculture. More commonly, other instruments such as LiDAR and GPS are part of the payload as well. The end result is a carefully balanced exercise in aerodynamics, optics, electronics, and data collection that companies such as Headwall are able to manage.

The human eye can only respond to wavelengths between roughly 390-700nm. Many of the reflective 'signatures' given off by plants and chemicals fall outside that range. For example, the Nano-Hyperspec sensor used by the Bristol team operates in what is called the 'Visible-Near-Infrared' range of 400-1000nm ('VNIR' for short). In that range, the sensor is 'seeing' with an extraordinarily high degree of specificity and resolution, far beyond what a human could discern. Indeed, these sensors are collecting an astounding amount of spectral data on a per-pixel basis, resulting in 'data cubes' many gigabytes in size. Armed with spectral libraries that faithfully characterize the specifics of the terrain below, scientists can match the known library information with the collected airborne data and make quite accurate calls on what's what.
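
One common way to match collected airborne data against a spectral library is the spectral angle, which compares the shape of a pixel's spectrum to each library entry independent of overall brightness. A minimal Python sketch with made-up five-band spectra (illustrative only, not the Bristol team's actual library):

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a library spectrum;
    smaller angle = closer match, insensitive to overall brightness."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Hypothetical 5-band library spectra for illustration.
library = {
    "healthy_vegetation": np.array([0.05, 0.08, 0.04, 0.45, 0.50]),
    "bare_soil":          np.array([0.15, 0.20, 0.25, 0.30, 0.32]),
}
pixel = np.array([0.06, 0.09, 0.05, 0.40, 0.48])  # same shape as vegetation, scaled

best = min(library, key=lambda name: spectral_angle(pixel, library[name]))
print(best)  # the library entry with the smallest spectral angle
```

Because the angle ignores overall scale, a pixel under weaker illumination still matches the right library entry.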

The use of drones has accelerated this scientific effort because of two factors: First, they are affordable and easy to deploy on a tactical basis. One can be packed in a Range Rover and deployed almost anywhere in minutes whereas aircraft and satellites are, by nature, constricted, inflexible, and costly. This is not to say that aircraft and satellites will be supplanted by UAVs. There is valuable image data that can be collected using these high-flying platforms, and the overall knowledge base shared by the scientific community is made much more complete when all these assets are used in a synergistic fashion. Indeed, hyperspectral imaging was once the province of high-flying reconnaissance planes and satellites...neither of which could ever be used economically by university scientists. But the ubiquitous drone--a bane to some and a blessing to others--is the perfect platform from which to launch these exploratory efforts. "We're adding a bit more science to the UAV payload now," says Dr. Day. "We're starting to look at the spectrum of light and the colors of light that are coming off the minefield and using that data to find where the landmines are."

UAVs and drones seem to get media attention for all the wrong reasons, which is exactly why efforts by the esteemed team at the University of Bristol are to be applauded for developing a 'Better Way' to solve some of our toughest challenges. Hyperspectral imaging sensors can 'see' even beyond the VNIR range of interest to Dr. Day and Dr. Scott. The Shortwave-Infrared (SWIR) range starts near where VNIR leaves off, covering around 950-2500nm. The presence of certain chemicals, minerals and of course plant photosynthesis will become visible to sensors like these. Indeed, a broadband sensor package that covers the VNIR and SWIR range (400-2500nm) is particularly useful because it basically collects everything a scientific research effort might wish to see.

There are two key factors about hyperspectral imaging that are worth noting. The first is that the technology depends on 'reflected light.' The sensor is basically looking at how sunlight reflects off certain materials. Plant fluorescence, for example, has a particular 'spectral signature' that a sensor can understand. Obviously, this means an airborne hyperspectral sensor depends on a healthy amount of solar illumination and certainly is useless at night. But Headwall's sensors are designed to collect precise image data even under less-than-ideal solar conditions (cloud cover, or low angles, for example). The second factor is having a wide field of view. The sensor obviously can 'see' directly beneath the line of flight, but being able to do so off to the wide edges of the flight pattern makes the mission more efficient. Batteries being what they are, optimizing the flight duration by capturing a wide swath of land is obviously beneficial. This benefit is seen in the precise optical layout used by Headwall in the construction of each sensor.
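
The field-of-view point can be made concrete with simple geometry: for a nadir-pointing sensor, the ground swath grows with both altitude and FOV. A small Python sketch with illustrative numbers (not a published Headwall specification):

```python
import math

def swath_width_m(altitude_m, fov_deg):
    """Ground swath for a nadir-pointing line-scan sensor:
    swath = 2 * h * tan(FOV / 2)."""
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)

# Hypothetical flight: 100 m AGL with a 50-degree FOV.
print(round(swath_width_m(100.0, 50.0), 1))  # ~93.3 m covered per scan line
```

Doubling either the altitude or (roughly, for small angles) the FOV doubles the land covered per pass, which is exactly why a wide, well-corrected FOV stretches battery-limited flight time further.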

Crop science, climatology, geology, and even the inspection of infrastructure such as pipelines and rail beds depend on imaging sensors like those produced by Headwall. Hyperspectral sensors depend on 'motion,' since they basically collect images slice by slice as the UAV flies over the scene. The combination of all of these high-resolution 'slices' comprises what is known as a 'data cube,' which is pored over by scientists during post-processing. Of course, the hardware capturing these images represents about half the story. The other half can be found in the software that makes sense out of reams of spectral image data that all needs to be 'geo-tagged' and orthorectified. First and foremost, scientists need answers; the data (and the sensor collecting the data) are simply means to an end. When you go to your local DIY store or Lowe's or Home Depot, you really aren't buying a drill; you're going there to buy a hole.

But is all that image data really needed? Some efforts seek to cut corners by using less-capable 'multispectral' sensors that cover only a few bands rather than the hundreds of bands covered with hyperspectral. Using crop science as an example, a multispectral sensor might miss the telltale signature of an invasive disease on a tree canopy while hyperspectral will most certainly catch it. And that can mean the difference between saving a coffee bean harvest or a valuable wine-vineyard crop.


Tags: hyperspectral, Remote Sensing, UAS, UAV, University of Bristol

Data Fusion: A New Capability for the Remote Sensing Community

Posted by Christopher Van Veen on Tue, Mar 01, 2016

We’re seeing a tremendous increase in the number of airborne deployments for our hyperspectral imaging sensors. To a large degree, the trend toward smaller and more affordable UAVs is giving the remote sensing community more flexibility to undertake more missions to capture meaningful environmental data. From wine-grape vineyards in northern California to coffee bean plantations in South America, the precision agriculture community is embracing packaged ‘UAS’ offerings that combine a UAV matched to the payload it needs to carry.

[Image: HyperCore illustration]

Collecting meaningful, actionable data for a precision agriculture scientist can mean the difference between a healthy harvest and a disastrous one. Depending on the wavelength, the sensors will spot indices indicative of diseases, irrigation deficits, crop stress, and more. An affordable UAV thus takes the place of much more expensive manned aircraft flights. The financial savings notwithstanding, this new system can be hand launched and retrieved and basically deployed wherever and whenever (with adherence to all local aviation rules and regulations).

One trend we’re seeing at Headwall is the integration of multiple sensors, each having their own specific streams of data. For example, the payload might comprise a VNIR (400-1000nm) sensor along with a SWIR (1000-2500nm) instrument, but might also include LiDAR and typically a GPS/IMU. A Fiber-Optic Downwelling Irradiance Sensor (FODIS) is also often used to measure and collect data relative to changes in solar illumination.

Obviously, payload restrictions determine what the craft can lift and for how long. But it is a balance between choosing an affordable UAV that is small and light while understanding that it might not be able to carry all the instruments that a remote sensing mission might demand. Optimizing Size, Weight & Power (SWaP) is the guiding principle for missions involving UAVs. There are many fixed-wing and multi-rotor UAVs on the market that all specify their payload restrictions and flight durations.

The goal of any remote sensing activity is to see the unseen, and then make sense of the data during post processing. Because this data leads to important agricultural decisions, it’s crucial to synthesize the data from each instrument. The term for this is data fusion, and Headwall has just unveiled a new product called HyperCore™ that handles this important task. As the UAV flies its mission, data streams from the hyperspectral sensors, GPS/IMU, LiDAR, and other instruments are all collected on HyperCore’s 500GB drive for easy download (via Gig-E) later. HyperCore includes the most-used connections, including two Gig-E ports, a CameraLink port, power, and two I/O ports.
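
At its simplest, the synthesis step means putting every instrument's stream on a common time base. This hypothetical Python sketch (made-up timestamps and coordinates; it is not a description of HyperCore's internals) interpolates a GPS latitude stream onto each hyperspectral frame's timestamp:

```python
import numpy as np

# Sketch of time-aligning two instrument streams: GPS/IMU fixes arrive at
# their own rate; each hyperspectral frame gets a position by interpolation.
gps_t   = np.array([0.0, 0.1, 0.2, 0.3, 0.4])          # fix times, seconds
gps_lat = np.array([42.00, 42.01, 42.02, 42.03, 42.04])  # latitude at each fix

frame_t = np.array([0.05, 0.15, 0.25])                 # frame timestamps
frame_lat = np.interp(frame_t, gps_t, gps_lat)         # position per frame

print(frame_lat)  # [42.005 42.015 42.025]
```

The same interpolation applies to longitude, altitude, and attitude angles, after which each image line carries the geolocation needed for orthorectification.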


Tags: Airborne, Remote Sensing, UAS, UAV

Hyperspectral Imaging Technology a New Frontier for KAUST in Saudi Arabia

Posted by Christopher Van Veen on Thu, Oct 08, 2015

King Abdullah University of Science and Technology (KAUST) is a public research institution in Saudi Arabia. By any measure it is a very young university, founded in 2009. But in that short span of time it has rapidly grown to accumulate an astounding number of research and citation records.

Being a science and technology university, KAUST focuses on traditional subjects such as math, electrical engineering, and computer science. KAUST is already well versed in spectroscopy, having a burgeoning lab full of instruments that can peer into the chemical underpinnings of minerals, plants, and crops. The lab is outfitted with spectroscopy, chromatography, and mass spectrometry instruments tasked with learning more about trace-metals analysis, wet chemistry, and surface analysis.

But one area of study is on pace to become its most popular: Earth and Environmental Sciences. Here, students and faculty pore over ways to use new technology in learning more about precision agriculture, water resources, and atmospheric conditions. This takes spectroscopy out of the lab and into the air, specifically using drones and UAVs.

[Image: KAUST Nano-Hyperspec, photo 1]

Spectroscopy takes several different forms, but multispectral and hyperspectral are of primary interest within the scientific research community. The primary difference between the two is that hyperspectral imaging technology provides complete spectral information for every pixel in the scene...literally hundreds of bands. Comparatively, multispectral will only detect a handful of bands. In other words, it can mean the difference between discovering an invasive disease in a crop field or vineyard and missing it altogether. The Headwall sensors have a wide field of view (FOV) coupled with aberration-corrected optics, meaning that image data is as crisp and precise along the edges of the FOV as it is directly underneath the flight path. More data is collected for each mission, making the data-collection project more efficient. With battery life being key, efficiency matters.  Armed with spectral libraries that define the chemical composition of everything the sensor ‘sees,’ scientists have the ability to look well beyond what is actually ‘visible.’ 

Matthew McCabe, a professor at KAUST, recognizes the value of real-time environmental analysis in the work his Hydrology and Land Observation (HALO) Group does. “We are interested in exploring hyperspectral sensing to better understand plant health and function, particularly as relates to agricultural settings,” said McCabe. “The capacity to retrieve information on plant health is of interest not just for the obvious and important monitoring of crop state, but also in better constraining coupled water-energy-carbon models of vegetation systems.” According to McCabe, these models are generally poorly constrained, so a system that enables investigating plant health and condition in near real-time is of much interest.

Already familiar with less precise multispectral instruments, McCabe is now investing in airborne hyperspectral sensors that see more and can deliver vast amounts of spectral data. “While we already employ multispectral sensors covering bands known to inform upon plant systems, it is the capacity to further expand our knowledge of plant spectral response that we are most interested in,” said McCabe. “Determining new spectral relationships in plant behavior and response is an area of research that hyperspectral imaging can really drive.”

[Image: KAUST Nano-Hyperspec, photo 2]

The field of crop science is a key deployment for hyperspectral imaging. Here, McCabe chose Headwall’s Nano-Hyperspec sensor, which covers the core Visible/Near-Infrared (VNIR) range from 400nm to 1000nm. Most anything a crop scientist wants to ‘see’ will be found in that VNIR range. “We wanted a very precise spectral imager that operated in this important VNIR range,” said McCabe. “We needed to ‘see’ spectral information for every pixel within the field of view, and we knew that Headwall’s Nano-Hyperspec had very precise edge-to-edge imaging performance that would optimize the flight efficiency of our UAV.”

With the newly acquired Nano-Hyperspec mounted aboard a KAUST-engineered quadcopter, McCabe and his team will set off to learn about plant-specific spectral-traits that provide direct insight into health and productivity functions. “This is one of our research goals,” said McCabe. “Hyperspectral sensing and analysis allows us to explore this exciting area of research in ways that multispectral cannot.”

Choosing a sensor and deploying it properly on a UAV is a challenge, especially since the technology is still relatively young. “We needed a sensor package that matched our UAV,” said McCabe. That meant it had to be lightweight and small, and it had to have integrated data storage for the many gigabytes of spectral data pouring into the sensor while aloft. The Headwall solution added a GPS/IMU and full software control for an overall package that put McCabe in the air far sooner than he’d otherwise expect. “Partnering with Headwall gave us months of headway in terms of getting in the air and collecting great data,” said McCabe.

And why Headwall? “We knew the company was a strong and well respected brand in the spectral sensing domain, with a long history of product excellence,” said McCabe. “We had researched a number of competing solutions, but Headwall turned out to be the most professional and competent among them.” McCabe also noted that numerous colleagues and scientists were currently using Headwall hyperspectral instruments with great success.

Moving forward, what are some of the exciting plans in the area of environmental analysis using Headwall’s hyperspectral system? “Our initial research will be focused on the indirect retrieval of plant pigments such as chlorophyll and carotenoids,” said McCabe. This spectral data will provide information on water use and stress condition of agricultural systems. “Ultimately, we are interested in better understanding plant water use through transpiration, but there are also opportunities for better constraining crop-yield through routine spatial sampling that is available when coupled to a UAV,” noted McCabe. “Hyperspectral sensing allows for a number of innovative ways to explore these ideas. But there are also clear opportunities beyond crop science. Indeed, we see applications across many of the multi-disciplinary research areas we are engaged in.” Interestingly but not surprisingly, McCabe is thinking well beyond UAVs: “We also have active projects in satellite validation and expect to engage with collaborators on both small and larger scale crop stress monitoring and even disease mapping using sensors mounted aboard commercial LEO satellites.”

Of course, Headwall will be there when the time comes.


Tags: Remote Sensing, UAS

We're Giving Drones a Good Name

Posted by Christopher Van Veen on Wed, Oct 07, 2015

Drones seem to be in the news for all the wrong reasons. The media reminds us that they're nothing but nuisances: peeking at people, crashing into stadiums, hovering over the White House, and causing airliners to take evasive maneuvers. The FAA in this country is taking an active stance on the safe operation of drones, and the topic is being explored elsewhere around the globe. What everyone recognizes is that it's a world full of both promise and uncertainty. Indeed, the automobile was born under the same set of circumstances!

Having just returned from a week-long conference in Reno, Nevada, my post today is meant to emphasize the good work drones can do. The biggest application among them is precision agriculture, where a drone outfitted with the right instrumentation can hover over orchards and vineyards and spot telltale signs of diseases that aren't readily seen from the ground. Monitoring irrigation levels and fertilizer effectiveness are two other key applications, as are climatology, pipeline monitoring, and geology.

Two makers of UAVs present at the conference are Headwall customers. PrecisionHawk builds a fixed-wing system while ServiceDrones offers a multi-rotor craft. There are reasons for using either. The amount of room you have to take off and land is one consideration; the overall battery life (flight duration) is another; and payload capacity is a third. The key task is to match everything to the mission, which is why integration is so important.

All told, the packaged technology of drones and sensors allows researchers to 'see' the invisible and learn more about the environment. Much of this territory is inaccessible by ground-based means, which keeps the risk to humans (and airliners) at the lower end of the scale. The use of drones has exploded for two primary reasons. Chief among them is affordability, which positions drones much more favorably compared with manned fixed-wing aircraft. Second is ease of use. Drones are now more 'mainstream' than ever, and their ability to carry reasonable instrumentation payloads allows them to do this kind of scientific 'remote sensing.'

Instruments such as hyperspectral sensors are getting smaller, lighter, and more affordable. With them, scientists can now unlock hidden secrets and spot trends by analyzing very detailed, data-rich images. We are helping to create a 'new set of eyes' for the scientific community. The drones themselves become a vital 'delivery system,' and the pairing of these technologies is giving rise to conferences such as the ASPRS Mapping event in Reno. It was a combination of test flying and presentations, with the flying happening in gorgeous Palomino Valley, located about 35 miles north of Reno.

Through it all, safety was paramount during the flying demonstrations. FAA inspectors were with us every step of the way to make sure that all the programmed flight plans were adhered to. Each drone carried an 'N' registration number, as a regular aircraft would. This is serious business with huge upside potential for geologists, crop scientists, the petroleum industry, and environmentalists. It pays to understand the regulations and work within them, because this whole business is a 'new frontier' for everyone. And while the term 'drone' conjures up a rather negative image, the more proper description is 'Unmanned Aircraft System,' or 'UAS' for short. These truly are 'systems' because they pair a flying machine (either fixed-wing or multi-rotor) with the instruments they carry.

And what kind of instruments? For precision agriculture, a hyperspectral sensor covering the Visible and Near-Infrared (VNIR) range of 400-1000nm will spot disease conditions on tree canopies. With entire economies depending on crops (hello, Florida citrus!), the ability to spot tree-borne diseases and other plant-stress situations is massively beneficial. First, the instruments are precise and can spot the 'invisible.' Second, the drones allow for the rapid and complete coverage of remote areas that might take days or weeks to map. And perhaps most telling, some disease conditions will only be visible from the top down rather than from the bottom up. An inspector on a ladder under a tree will likely miss something that the drone spots, and this can mean the difference between a bountiful harvest and a financial catastrophe. Any high-value crop (think citrus, wine grapes, pistachios, coffee beans, walnuts, etc.) needs this kind of imaging oversight. Our Nano-Hyperspec is extremely popular for this kind of work.
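
For the technically curious, here's a rough sketch of the kind of index math that happens downstream of a VNIR sensor once a reflectance datacube is in hand. Everything below (the band positions, the cube shape, the 0.4 stress threshold) is an illustrative assumption on my part, not a Headwall algorithm:

```python
import numpy as np

def band_index(wavelengths, target_nm):
    """Return the index of the band closest to a target wavelength."""
    return int(np.argmin(np.abs(np.asarray(wavelengths) - target_nm)))

def ndvi(cube, wavelengths, red_nm=670.0, nir_nm=800.0):
    """Compute NDVI per pixel from a (rows, cols, bands) reflectance cube."""
    red = cube[:, :, band_index(wavelengths, red_nm)].astype(float)
    nir = cube[:, :, band_index(wavelengths, nir_nm)].astype(float)
    return (nir - red) / (nir + red + 1e-9)  # tiny epsilon avoids divide-by-zero

# Toy example: 270 bands evenly spanning the 400-1000nm VNIR range
wl = np.linspace(400, 1000, 270)
cube = np.random.rand(64, 64, 270)   # stand-in for real reflectance data
vi = ndvi(cube, wl)
stressed = vi < 0.4                  # hypothetical stress threshold
```

With hundreds of contiguous bands rather than a few broad ones, the same pattern extends to narrow-band indices (red-edge position, carotenoid ratios) that multispectral cameras simply cannot resolve.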

When it comes to airborne work, one of the most desired attributes of a hyperspectral sensor is a wide field of view. Simply put, the sensor needs to deliver crisp hyperspectral data at the edges of its field of view just as it would directly underneath the flight path. The wider and sharper the field of view, the more efficient the flight path can be. And when it comes to drones, battery life determines the overall flight duration. So a hyperspectral sensor having an aberration-corrected wide field of view can cover more ground for a given flight envelope. More image data is thus collected for every flight, making the research project very efficient.
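
The geometry behind that efficiency argument is simple enough to sketch. The altitude, speed, and field-of-view numbers below are illustrative only, not specifications for any particular sensor:

```python
import math

def swath_width_m(altitude_m, fov_deg):
    """Ground swath width for a nadir-pointing push-broom sensor over flat terrain."""
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)

def coverage_rate_ha_per_min(altitude_m, fov_deg, speed_m_s):
    """Hectares imaged per minute of flight (ignoring overlap between passes)."""
    return swath_width_m(altitude_m, fov_deg) * speed_m_s * 60.0 / 10_000.0

# Hypothetical comparison: same 100m altitude and 5 m/s groundspeed,
# a 50-degree field of view versus a 25-degree one
wide = coverage_rate_ha_per_min(100, 50, 5)
narrow = coverage_rate_ha_per_min(100, 25, 5)
```

Doubling the usable field of view roughly doubles the area covered per battery charge, which is exactly why the edges of that view must stay aberration-free to count as "usable."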

In addition to hyperspectral sensors, drones will also need a GPS to tie the incoming spectral data to its exact geographic location. Another frequently requested instrument is LiDAR (Light Detection and Ranging), which provides elevation detail that is paired with the hyperspectral data. Obviously the combination of all these separate instruments makes for a payload that consumes valuable weight and space, putting it out of the realm of possibility for today's new breed of hand-launched UAVs. With that in mind, my company (Headwall Photonics, Inc.) takes time to engineer and 'integrate' the sensor so that it is as small and as light as possible. Combining the data storage inside the sensor is one way; direct-attaching the GPS is another. The connecting cables you don't need mean weight you don't have to lift!
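
One way to picture the GPS pairing: a push-broom sensor records one spatial line per frame, so every scan line needs its own position estimate interpolated between (slower) GPS fixes. A minimal sketch, with made-up frame rates and coordinates:

```python
import numpy as np

def geotag_lines(frame_times, gps_times, gps_lat, gps_lon):
    """Interpolate GPS fixes to each scan line's timestamp.
    Each frame of a push-broom sensor is one spatial line, so every
    line gets its own lat/lon estimated between the GPS fixes."""
    lat = np.interp(frame_times, gps_times, gps_lat)
    lon = np.interp(frame_times, gps_times, gps_lon)
    return lat, lon

# Toy data: a 100fps sensor paired with a 10Hz GPS over one second of flight
frame_t = np.arange(0, 1, 0.01)
gps_t = np.arange(0, 1.1, 0.1)
lat, lon = geotag_lines(frame_t, gps_t,
                        42.58 + gps_t * 1e-5,    # drifting north (illustrative)
                        -71.80 + gps_t * 1e-5)   # drifting east (illustrative)
```

Real airborne packages also fold in IMU attitude (roll/pitch/yaw) before ortho-rectification, but the timestamp-matching idea is the same.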

Finally, conferences like the ASPRS event in Reno are places where people can learn. Understanding the challenges and potential integration pitfalls is what we at Headwall were there to convey, and our message was very well received. The mistake we all want to avoid is having users blinded by the promise of airborne hyperspectral imaging, dashing off and grabbing any affordable UAV and bolting instruments onto it. For one, such an approach is dangerously naive. Second, the time needed to integrate everything is practically always underestimated. And third, it becomes a very costly endeavor when the price of time is factored in.

At Headwall, although our business is the production of the industry's best hyperspectral imaging sensors, we understand integration issues better than anyone. We're here to help navigate the process and get the scientific research community in the air faster, doing all the good things 'drones' can do.

Tags: hyperspectral, Remote Sensing, Sensors, UAS, VNIR, UAV

Nano-Hyperspec...in the air and on the ground

Posted by Christopher Van Veen on Fri, Feb 06, 2015

Next week during Photonics West we’ll be demonstrating our very newest hyperspectral sensor: Nano-Hyperspec. We gave it that name because it’s small...exceptionally small. Think of a Rubik's Cube and you've got it. The market said it needed a robust, aberration-corrected hyperspectral sensor purpose-built for small, hand-launched UAVs. One perfect example is the X6 from the Aibotix division of Leica-Geosystems, a company with whom Headwall signed an agreement in late 2014. “There’s a confluence within the remote sensing marketplace,” said Headwall CEO David Bannon. “The attractiveness of affordable, easy to launch UAVs runs headlong into the need for perfectly matched sensor instruments that they can carry.” In conceiving Nano-Hyperspec, Headwall consolidated and integrated as much as possible to yield a small, performance-packed unit that even the smallest UAVs could easily carry.

“Ordinarily, a hyperspectral sensor talks to a separate computer in order to transfer large amounts of image data quickly,” noted Bannon. “But small UAVs don’t have the payload capacity to carry a separate data-processing unit and the cables they require.” So the first order of business was to put the data processing and storage technology into the sensor itself, which frees up space for other accessories. For proper image-data collection from a UAV, the hyperspectral sensor needs to work along with a GPS. Nano-Hyperspec was designed so that the GPS can attach directly to the housing, further saving weight and space. “Integrating these normally disparate pieces into an integrated whole is what the market continually tells us it needs,” noted Bannon. “All of this not only makes for a lightweight sensor package, but also allows for the addition of technology such as LiDAR, which itself is collecting valuable data for scientists to use.”

Nano-Hyperspec focuses on the Visible and Near-Infrared spectral range (often referred to as ‘VNIR’) of 400-1000nm. “Much of what needs to be seen from a UAV is taken at slow speeds and low altitudes,” said Bannon. This can be precision agriculture, environmental monitoring, minerals and geology, or any of a number of other uses. But to a large degree, what becomes visible to a hyperspectral sensor between 400 and 1000nm can include the presence of disease conditions on a tree canopy where it otherwise might be invisible from below. “Entire economies depend on agriculture,” said Bannon. “If a low-flying UAV with our specially-tuned hyperspectral sensor can ‘see’ an invasive disease, our technology becomes vital rather than simply desired.”

One of the hallmarks of all Headwall sensor designs is aberration-correction. In simple terms, this means making sure that the sensor sees as crisply and clearly off to the edges of its field of view as it does straight beneath the line of flight. The holographic diffraction grating embedded within each sensor is designed to make this so, by eliminating unwanted artifacts such as ‘keystone’ and ‘smile’ that are more pronounced off to the edges of the field of view. “In practical terms, it means that the sensor has a very wide field of view that is accurately represented,” said Bannon. A wider view means a more efficient flight path. In short, the UAV can cover more ground because it can accurately ‘see’ more ground. This is particularly crucial because UAVs are battery-powered; the objective is to maximize useful work in the limited time aloft. A wide view of the ground at exceptionally high spatial and spectral resolution allows this to be so.
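
For readers curious what 'smile' means in data terms: it is a column-dependent wavelength shift, so each spatial column's spectrum sits on a slightly different grid. Headwall corrects this optically in the grating, but a purely illustrative software analogue would resample every column back onto the nominal wavelength axis:

```python
import numpy as np

def correct_smile(frame, column_shifts_nm, wavelengths):
    """Resample each spatial column of a (bands, columns) frame onto the
    nominal wavelength grid, undoing a per-column 'smile' shift.
    (Toy illustration only; real correction happens in the optics.)"""
    corrected = np.empty(frame.shape, dtype=float)
    for col, shift in enumerate(column_shifts_nm):
        # Column 'col' actually sampled (wavelengths + shift); interpolate back
        corrected[:, col] = np.interp(wavelengths, wavelengths + shift, frame[:, col])
    return corrected
```

The point of doing this in hardware instead is that interpolation smears spectral detail, whereas an aberration-corrected grating delivers every column on the same grid to begin with.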

It has been said that people buy holes, not drills. They basically have a problem that needs an answer. How they get their hole or derive their answer is an exercise in technology, economics, and speed. “We have a technical solution that is affordably priced,” said Bannon. The partnership with Leica-Geosystems helps. “Time-to-deploy is an exercise in economics and lost opportunity because real value can be derived the sooner the UAV/hyperspectral package is airborne and collecting useful data.”

Not lost on the remote-sensing community is this: many applications involve taking image data from the ground rather than from a UAV. Nano-Hyperspec is easily attached to a tripod and a rotational stage so that the necessary movement (which ordinarily would come from a UAV) instead happens from a ‘stationary’ platform. These deployments are sometimes called ‘point-and-stare’ or ‘pan-and-tilt,’ and they represent a means of accomplishing movement-based hyperspectral imaging on the ground.

Headwall's booth at Photonics West (Moscone Convention Center, San Francisco) is 2506. Hope to see you there!

Tags: hyperspectral imaging, Headwall Photonics, Remote Sensing, UAS, UAV, Leica-Geosystems

Headwall Delivers Micro-Hyperspec® Sensors to Columbia University

Posted by Christopher Van Veen on Thu, Oct 09, 2014

High-performance imaging sensors on small, commercial UAS will assess ocean and sea ice variability in Arctic zones

FITCHBURG, MA - OCTOBER 9, 2014: Headwall Photonics has delivered two high-performance hyperspectral imaging sensors to Columbia University as part of its Air-Sea-Ice Physics and Biogeochemistry Experiment (ASIPBEX). ASIPBEX is part of a larger international collaborative investigation of Climate Cryosphere Interaction with colleagues from Spain, Germany, and Norway. This crucial remote-sensing project will use a high-endurance unmanned aircraft system (UAS) to investigate climatological changes present in the Arctic Ocean around Svalbard, Norway. The instrument payload comprises two Micro-Hyperspec sensors; one will cover the Visible-Near-Infrared (VNIR) range of 400-1000nm while the other will cover the Near-Infrared (NIR) range of 900-1700nm. Together, the sensors will be crucial in detecting indicators of sea ice physics, solar warming, and global carbon cycles.

 

"We chose the Headwall sensors for several reasons," stated Christopher Zappa, a Lamont Research Professor at Columbia's Lamont-Doherty Earth Observatory. "The very high resolution allows us to collect and process vast amounts of spectral and spatial data upon which our research and analysis depend." The wide field of view of the Headwall sensor combined with aberration-corrected optics also contributes to overall flight-path efficiency. The UAS allows scientists to measure in places that typically are impossible to get to using ships or manned aircraft. This opens up the possibility for transformative understanding of the climate system. "Since we're using a UAS, we depend on 'seeing' as much of the ocean surface as possible, minimizing any aberrations or unwanted artifacts along the edges of the field of view," noted Prof. Zappa. The combination of Micro-Hyperspec and Headwall's advanced Hyperspec III airborne software allows for the successful collection, classification, and interpretation of the spectral data collected during each flight.

 

This particular deployment for the ASIPBEX project is fundamental to Headwall's strategy of advancing the science of remote sensing aboard small, commercial unmanned aircraft systems. "Hyperspectral represents a crucial payload for any manned or unmanned deployment," noted Headwall CEO David Bannon. "But notably, the UAS has become a 'go-to' platform. This means not only smaller and lighter sensors, but also integrated solutions that factor in everything from LiDAR and data-management to post-processing tasks such as ortho-rectification that our software can handle." Because the Micro-Hyperspec sensor uses high-efficiency diffraction gratings in a concentric optical design, imaging performance and signal-to-noise are both maximized. The patented optical design provides a package that is rugged and robust for airborne use in harsh environments such as the Arctic Ocean.

 

The Observatory for Air-Sea Interaction Studies (OASIS) 

Led by Professor Christopher Zappa, the Observatory for Air-Sea Interaction Studies (OASIS) conducts research in a variety of fields focused on the oceanic and atmospheric boundary layers. These include wave dynamics and wave breaking, air-sea CO2 gas exchange, non-satellite remote sensing, and boundary-layer processes. Affiliated with the Lamont-Doherty Earth Observatory (LDEO) and Columbia University, OASIS is involved in joint projects with the Polar Geophysics Group of LDEO, Yale University, the University of Heidelberg, the University of Connecticut, and the University of New South Wales, and has participated in various large multi-institution projects such as CBLAST-Low, GasEx, VOCALS, RaDyO, and DYNAMO.

The group develops and deploys instruments including infrared, multispectral, and polarimetric cameras on fixed and mobile platforms such as ships, aircraft, and buoys. Study areas range from laboratory wind-wave tanks and Biosphere 2 to local rivers and estuaries, shelf seas and polynyas, and the open ocean from the poles to the equator.


For information contact:

Professor Christopher J. Zappa, Lamont Research Professor 

Lamont-Doherty Earth Observatory 

[email protected]

Tags: hyperspectral imaging, Airborne, Remote Sensing, Micro Hyperspec, UAS

Smaller, Lighter, Better: Hyperspectral on UAVs

Posted by Christopher Van Veen on Fri, Aug 08, 2014

At Headwall we've been busy listening to the market. When it comes to airborne remote sensing, the market is telling us that it favors UAVs (unmanned aerial vehicles) of all kinds: fixed-wing, multi-rotor, and so on. There's no end to the number of companies producing UAVs globally. Because many UAVs produced today are very small and affordable, they are 'within reach' of those with even modest means. Universities represent one key market where the use of UAVs is rapidly increasing. Full of scientists and research departments, universities around the globe see these small and light UAVs as a perfect platform from which to launch their exploratory studies. They are affordable, easy to assemble and transport, and (especially with multi-rotor models) can take off and land within a very small footprint.

But alongside all this enthusiasm for UAVs, there are many who frown upon these airborne vehicles and see them as a nuisance. Indeed, they can be a nuisance when used for trivial pursuits. In densely-populated areas they certainly can be more than an annoyance...they can be dangerous. But largely, the work we are seeing our customers undertake with hyperspectral imagers attached to UAVs is very valuable work indeed. And it takes place far from the hustle and bustle of any urban landscape. For example, precision agriculture is made more valuable because there are key indices of plant health and physiology that are more readily seen from above than from below. Certain disease conditions are ‘visible’ using hyperspectral imaging, especially with the high spectral and spatial resolution found on all Headwall sensors. Other research pursuits include environmental analysis, geology, pollution analysis, and many more. These are very good and valuable scientific efforts made more so by the UAVs that enable these precision instruments to 'fly.' The marriage between hyperspectral and UAV seems to be a perfect one, especially when you consider how much ground can be covered with one of these flying wizards. And especially when you realize that hyperspectral imaging fundamentally requires movement to occur. In other words, hyperspectral was meant for airborne deployment. Where a Jeep can’t go, a UAV can. And furthermore, more ground can be covered with a UAV, meaning more efficient data collection over rugged and inaccessible landscapes.

As UAVs get smaller and lighter, users run headlong into the issue of payload: UAVs are limited with respect to what they can lift. Whatever else a UAV is asked to carry, it needs to lift batteries. Then comes the instrumentation. Headwall’s Nano-Hyperspec was just introduced for the VNIR (400-1000nm) spectral range. Most (but not all) of the things a research scientist might wish to ‘see’ are visible in this spectral range. But we did a couple of things with Nano-Hyperspec that help the payload issue. First, the size and weight are well below previous sensor offerings. Its size (including lens) is a scant 3” x 3” x 4.72” (76.2mm x 76.2mm x 119.2mm), and its weight is less than 1.5 lb. (0.68kg). Best of all, this includes on-board data storage of 480GB. That’s about 130 minutes at 100fps.
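
A quick sanity check on those figures: 480GB spread over roughly 130 minutes at 100 frames per second implies a little over 0.6MB per frame. Here's the back-of-the-envelope arithmetic (mine, not a Headwall spec sheet):

```python
def recording_minutes(storage_bytes, bytes_per_frame, fps):
    """Continuous recording time given storage capacity and sustained frame rate."""
    return storage_bytes / (bytes_per_frame * fps) / 60.0

# Back out the per-frame size implied by the quoted numbers
frames_total = 130 * 60 * 100                 # 130 minutes at 100fps
implied_frame_bytes = 480e9 / frames_total    # roughly 0.6MB per frame
minutes = recording_minutes(480e9, implied_frame_bytes, 100)
```

That per-frame figure is the kind of number to check against your own sensor configuration (spatial pixels x bands x bytes per sample) when planning how many flights fit on one storage fill.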

Aside from making Nano-Hyperspec smaller and lighter than other hyperspectral sensors, a key differentiator comes from embedding the data storage within the enclosure while providing multiple attach points for the GPS/IMU. Another key attribute is the inclusion of the full airborne version of Headwall’s Hyperspec III software, which includes a polygon flight tool for sensor operation and a real-time Ethernet Waterfall display. While the work to shrink the size and weight of Nano-Hyperspec is valuable by itself, it also allows the user more room and available payload to carry other instrumentation. Hyperspectral combined with LiDAR and thermal imaging is an extremely valuable package, made possible thanks to the overall size/weight reduction of Nano-Hyperspec and the embedding of the data storage/management capabilities (which were previously contained within a separate enclosure).

Hyperspec III software gives users full control over data acquisition, sensor operation, and datacube creation in ENVI-compatible format. Hyperspec III also works in full conjunction with the GPS that can be paired with the sensor as an available Airborne Package. In this optional package, customers are able to take advantage of real-time computation of inertially-enhanced position/velocity, -161dBm tracking sensitivity, accurate 360-degree 3D orientation output of attitude and heading, correlation of image data to GPS data, and much more. During post-processing, the Airborne Package also effortlessly handles radiometric calibration and conversion as well as orthorectification.
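
ENVI-compatible datacubes are convenient downstream because the format is simply a flat binary file plus a small text header describing dimensions and interleave. As a sketch, a minimal reader for the common BIL (band-interleaved-by-line) layout might look like this; to keep it short, the dimensions are passed in directly rather than parsed from the accompanying .hdr file:

```python
import numpy as np

def read_envi_bil(raw_path, lines, samples, bands, dtype=np.uint16):
    """Load a BIL-interleaved ENVI cube into (lines, samples, bands) order.
    In BIL layout the file stores, for each line, all bands of that line."""
    flat = np.fromfile(raw_path, dtype=dtype)
    cube = flat.reshape(lines, bands, samples)   # BIL: line, then band, then sample
    return np.transpose(cube, (0, 2, 1))         # -> (lines, samples, bands)
```

In practice a library such as Spectral Python or ENVI itself handles the header parsing, byte order, and the BIP/BSQ variants; the point here is only that the output of the software is an open, well-understood format.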

Tags: Airborne, Remote Sensing, UAV, agriculture, precision agriculture