Headwall Photonics Blog

Almond Inspection Goes High-Tech

Posted by Christopher Van Veen on Fri, Nov 18, 2016

In Manteca, CA, about 90 minutes south of Sacramento, sits Travaille and Phippen, a family-run business focused on processing the world's finest almonds. Manteca lies at the heart of California's agricultural valley, which is ripe with growers and processors of everything from avocados to nuts. Chances are, the produce that you pick up at your local grocery store comes from this fertile part of California that stretches hundreds of miles from north to south.

Almonds are increasingly America's favorite nut, with consumption growing far faster than even that of peanuts. And this preference is not limited to America; much of the rest of the world shares the craze for almonds. Indeed, the acknowledged health properties of almonds make them a favorite, guilt-free snack.

Between growing and eating almonds sits an important phase of the operation: processing and inspection. At Travaille and Phippen, there is a recognition that diligent inspection protocols can deliver a rapid payback: consumers see a higher-quality product, and government regulators appreciate the proactive approach to quality and safety. Scott Phippen, CEO of Travaille and Phippen, has made this connection while many of his competitors have not. "Our success comes from delivering pristine quality almonds globally," he said. "Whether it's Japan, the Middle East, or here in America, there are differing preferences for what consumers like to see. But what doesn't differ is a need for the highest quality."


One of the approaches Phippen embraced was to adopt spectral imaging technology near the end of the inspection process. The normal 'upstream' processes are designed to take out foreign material such as twigs, rocks and bits of shell. As 'good' product moves through to the final inspection stages, spectral imaging takes over to deliver a more granular look at the almonds. "Coloration is a huge grading factor," said Phippen. "You can have two or three really good-looking almonds in your inspection stream, with each having a slightly different tone or color. Spectral imaging allows us to segregate these 'good but different' almonds better than we ever could."

Getting Travaille and Phippen to this stage was a classic exercise in integration between Headwall and Bratney Companies (Des Moines, IA). Bratney provided several fully integrated inspection lines featuring conveyors, sensors, vacuum robots, and software control that allowed Travaille and Phippen to significantly boost its inspection quality while using 60% fewer people. For Peter Bratney, CEO of Bratney Companies, the marriage between varied but complementary technologies is paying dividends. "Travaille and Phippen is just one example of our integration expertise delivering a demonstrable return on investment," said Peter. "Headwall's hyperspectral imaging sensors can 'see' with a resolution and clarity that allows any food processing company to grade or distinguish its product better than ever."

As almonds enter the final inspection stream, small hyperspectral sensors silently scan from above. The sensors are armed with spectral libraries that serve as 'instruction sets' for the robotic arms a few feet down the line. Almonds that match a pre-determined spectral signature are allowed to pass, while ones that don't are vacuumed away. These might be cracked almonds, or even bits of foreign material not captured during earlier inspection phases. But color uniformity is important in the almond business, and the Bratney solution allows Travaille and Phippen to distinguish between almonds with minuscule coloration differences. “This is where advanced machine vision is headed,” said Bratney. “The key is in identifying cutting-edge technologies and then integrating them for the common good.”
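
To make that pass/reject step concrete, here is a minimal sketch of how a spectral-library match could be expressed in code. It is illustrative only: the spectral angle mapper shown is a generic matching technique, not necessarily the one running on the Bratney/Headwall line, and the band count, library spectra, and threshold are hypothetical.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral angle (radians) between a measured spectrum and a library reference."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def accept_almond(pixel_spectrum, library, max_angle=0.08):
    """Pass the almond if it matches any 'good' library signature closely enough;
    otherwise flag it for the downstream vacuum arm."""
    return any(spectral_angle(pixel_spectrum, ref) < max_angle for ref in library)

# Hypothetical usage: 270-band VNIR spectra and two 'good' reference signatures
library = [np.random.rand(270), np.random.rand(270)]
sample = np.random.rand(270)
print("pass" if accept_almond(sample, library) else "reject")
```

In a real line, each spectrum would come from calibrated reflectance pixels and the threshold would be tuned against graded samples.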

The food-processing industry understands imaging technology to a degree. RGB (red, green, blue) cameras are familiar. But today, with product quality and safety on everyone's mind, a finer net is needed. The push toward 'Advanced Machine Vision' is gathering momentum as food processing companies recognize that new types of spectral imaging technologies are needed. "Travaille and Phippen is capturing attention in the competitive global almond market because of their proactive approach to technology,” said Bratney. “This allows them to deliver a level of quality that’s hard for others to match.” 

Tags: hyperspectral, Almonds, almond inspection, Manteca, CA, Travaille and Phippen, Bratney Companies, advanced machine vision

Hyperspectral Imaging & Processing Takes Hold in Turkey

Posted by Kevin Lynch on Mon, Jun 06, 2016

Hyperspectral imaging is a valuable scientific technique for a wide range of applications that extend far beyond the defense and military uses for which the technology was originally conceived. These include airborne precision agriculture, advanced machine vision, forensics, medical science, cultural preservation, and many others. For scientists in these application areas, hyperspectral imaging is often a new tool, and the need for education is common. As a leader in the field of hyperspectral imaging, Headwall partners with its international resellers to sponsor workshops and events geared toward promoting an understanding of hyperspectral imaging and how best to use the rich data coming from the sensors.

Headwall and its valued partner in Turkey, Visratek, recently sponsored a "Workshop on Image and Signal Processing" organized by the IEEE GRSS Turkey Chapter as part of the "24th Signal Processing and Communications Applications" conference (known by its Turkish abbreviation, SIU). Held annually since 1993, SIU has been organized in various cities throughout Turkey; in 2016 it was held in Zonguldak, sponsored by Bulent Ecevit University. The events are very well attended, with the 2015 SIU attracting more than 700 submitted scientific papers written in Turkish with English abstracts. A technical program committee of more than 600 experts provided more than 1,500 paper reviews, and attendance runs to roughly 600 leading scientists, researchers, and industry practitioners from Turkey. Says Visratek head Fatih Ömrüuzun, "This is the event where visibility is very important; our linkage with Headwall and our sponsorship of SIU puts us in a very good position as an expert in this field."


The IEEE SIU Conference is therefore the most significant scientific event for the signal processing and communications community in Turkey. The conference is a four-day event. The first three days are dedicated to keynote talks from distinguished lecturers from universities worldwide, along with tutorials and technical sessions featuring oral and poster presentations of scientific papers. The 24th IEEE SIU conference also featured an industry forum and exhibition program where Headwall and Visratek were prominent. The fourth day of the conference was dedicated to social programs, including day trips to beautiful and historic towns such as Amasra, Safranbolu, and Kdz. Ereğli on the coast of Turkey's West Black Sea Region.

Tags: hyperspectral, signal processing, Visratek, IEEE, Turkey

Landmine Detection Using Hyperspectral Imaging

Posted by Christopher Van Veen on Wed, Apr 20, 2016


When you see how rapidly the use of drones for scientific research has risen, you first conclude that it's all about precision agriculture and climatology. To be sure, those are trendsetting applications when it comes to using hyperspectral imaging sensors aboard UAVs. But over at the University of Bristol, two scientists are leading a team focused on 'finding a better way' to detect the presence of landmines that kill or maim thousands of people annually. With around 100 million landmines underneath the ground globally, traditional means of finding and eliminating them would take about 1,000 years and cost upwards of $30 billion according to some estimates.

Find A Better Way is a UK-based not-for-profit founded by famed footballer Sir Bobby Charlton. Despite his heroic sporting achievements, Sir Bobby is now forging a legacy outside of football through his determination to champion the cause of landmine detection and elimination. He witnessed the destruction caused by landmines on visits to Cambodia and Bosnia as a Laureus Sport for Good Ambassador. He founded Find A Better Way after recognizing that research and development held the key to the major changes needed for humanitarian teams to rid the world of the threat of landmines.

"We want to do something that very quickly delivers a step-change in capability while reducing overall human risk involved with finding and eliminating landmines," said Dr. Tom Scott of the University of Bristol. He along with Dr. John Day are pairing their advanced UAV with small and lightweight hyperspectral imaging sensors from Headwall Photonics to 'see' with a specificity and resolution unheard of only a few short years ago. "These drones can be autonomously deployed to fly over a landmine area and provide high-resolution images that allow us to reconstruct the 3-D terrain with very high accuracy," said Dr. Scott. With all the landmines across the world, tactical deployment of numerous low-flying drones is going to win the day over expensive satellites or high-flying aircraft. In order to meet this objective, the package needs to be simultaneously affordable, light, and suited to its mission. There is a vast amount of integration and testing work involved before the first meaningful flights can be flown. Recognizing this, Headwall is assuming much of this work so that users can compress this time-to-deployment significantly. Because the use of drones for scientific research is still in its infancy, misconceptions abound. Acquiring a UAV and slapping a hyperspectral sensor to it without first considering all the variables is a recipe for disaster. This holds true whether the mission is landmine detection or precision agriculture. More commonly, other instruments such as LiDAR and GPS are part of the payload as well. The end result is a carefully balanced exercise in aerodynamics, optics, electrics, and data-collection that companies such as Headwall are able to manage.

The human eye can only respond to wavelengths between roughly 390 and 700nm. Many of the reflective 'signatures' given off by plants and chemicals fall outside that range. For example, the Nano-Hyperspec sensor used by the Bristol team operates in what is called the Visible-Near-Infrared range of 400-1000nm ('VNIR' for short). In that range, the sensor is 'seeing' with an extraordinarily high degree of specificity and resolution, far beyond what a human could discern. Indeed, these sensors are collecting an astounding amount of spectral data on a per-pixel basis, resulting in 'data cubes' many gigabytes in size. Armed with spectral libraries that faithfully characterize the specifics of the terrain below, scientists can match the known library information with the collected airborne data and make quite accurate calls on what's what.
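
To get a feel for why those cubes run to many gigabytes, a back-of-the-envelope estimate helps. The sensor settings below are illustrative assumptions, not the Bristol team's actual configuration.

```python
# Illustrative data-cube size estimate for a VNIR pushbroom sensor.
spatial_pixels = 640        # pixels across the sensor's slit
spectral_bands = 270        # bands across 400-1000nm
lines_per_second = 100      # frame (line) rate during flight
flight_seconds = 10 * 60    # a ten-minute collection
bytes_per_sample = 2        # 12- or 16-bit samples stored as 2 bytes

cube_bytes = (spatial_pixels * spectral_bands * lines_per_second
              * flight_seconds * bytes_per_sample)
print(f"{cube_bytes / 1e9:.1f} GB")   # roughly 20.7 GB for this example
```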

The use of drones has accelerated this scientific effort for two reasons: they are affordable, and they are easy to deploy on a tactical basis. One can be packed in a Range Rover and deployed almost anywhere in minutes, whereas aircraft and satellites are, by nature, constrained, inflexible, and costly. This is not to say that aircraft and satellites will be supplanted by UAVs. There is valuable image data that can be collected using these high-flying platforms, and the overall knowledge base shared by the scientific community is made much more complete when all these assets are used in a synergistic fashion. Indeed, hyperspectral imaging was once the province of high-flying reconnaissance planes and satellites...neither of which could ever be used economically by university scientists. But the ubiquitous drone--a bane to some and a blessing to others--is the perfect platform from which to launch these exploratory efforts. "We're adding a bit more science to the UAV payload now," says Dr. Day. "We're starting to look at the spectrum of light and the colors of light that are coming off the minefield and using that data to find where the landmines are."

UAVs and drones seem to get media attention for all the wrong reasons, which is exactly why efforts by the esteemed team at the University of Bristol are to be applauded for developing a 'Better Way' to solve some of our toughest challenges. Hyperspectral imaging sensors can 'see' even beyond the VNIR range of interest to Dr. Day and Dr. Scott. The Shortwave-Infrared (SWIR) range starts near where VNIR leaves off, covering around 950-2500nm. The presence of certain chemicals, minerals and of course plant photosynthesis will become visible to sensors like these. Indeed, a broadband sensor package that covers the VNIR and SWIR range (400-2500nm) is particularly useful because it basically collects everything a scientific research effort might wish to see.

There are two key factors about hyperspectral imaging that are worth noting. The first is that the technology depends on reflected light: the sensor is basically looking at how sunlight reflects off certain materials. Plant fluorescence, for example, has a particular 'spectral signature' that a sensor can understand. Obviously, this means an airborne hyperspectral sensor depends on a healthy amount of solar illumination and is useless at night. But Headwall's sensors are designed to collect precise image data even under less-than-ideal solar conditions (cloud cover or low sun angles, for example). The second factor is having a wide field of view. The sensor obviously can 'see' directly beneath the line of flight, but being able to do so out toward the edges of the flight path makes the mission more efficient. Batteries being what they are, optimizing the flight duration by capturing a wide swath of land is obviously beneficial. This benefit is built into the precise optical layout Headwall uses in the construction of each sensor.
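
A common way sensors and their software cope with varying illumination is to convert raw counts into reflectance using dark and white reference measurements. The sketch below shows that standard correction in generic form; it is an illustration under assumed array shapes, not a description of Headwall's internal processing.

```python
import numpy as np

def to_reflectance(raw, dark, white):
    """Convert raw sensor counts to relative reflectance using dark-current and
    white-reference frames captured under the same illumination."""
    return np.clip((raw - dark) / np.maximum(white - dark, 1e-6), 0.0, 1.5)

# Hypothetical frames shaped (lines, samples, bands)
raw = np.random.randint(200, 4000, (100, 640, 270)).astype(float)
dark = np.full((100, 640, 270), 120.0)
white = np.full((100, 640, 270), 3800.0)
reflectance = to_reflectance(raw, dark, white)
```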

Crop science, climatology, geology, and even the inspection of infrastructure such as pipelines and rail beds depend on imaging sensors like those produced by Headwall. Hyperspectral sensors depend on motion, since they collect images slice by slice as the UAV flies over the scene. Together, these high-resolution 'slices' make up what is known as a 'data cube,' which is pored over by scientists during post-processing. Of course, the hardware capturing these images represents only about half the story. The other half can be found in the software that makes sense of reams of spectral image data, all of which needs to be geo-tagged and orthorectified. First and foremost, scientists need answers; the data (and the sensor collecting it) are simply a means to an end. When you go to your local DIY store, Lowe's, or Home Depot, you really aren't buying a drill; you're going there to buy a hole.
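
Because a pushbroom sensor records one cross-track line at a time, the data cube is literally a stack of those line frames. A minimal sketch, with hypothetical array sizes:

```python
import numpy as np

def assemble_cube(line_frames):
    """Stack successive (samples x bands) line frames from a pushbroom scan
    into a (lines x samples x bands) hyperspectral data cube."""
    return np.stack(line_frames, axis=0)

# Hypothetical scan: 500 lines, 640 cross-track samples, 270 spectral bands
frames = [np.zeros((640, 270)) for _ in range(500)]
cube = assemble_cube(frames)
print(cube.shape)   # (500, 640, 270)
```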

But is all that image data really needed? Some efforts seek to cut corners by using less-capable 'multispectral' sensors that cover only a few bands rather than the hundreds of bands covered by hyperspectral. Using crop science as an example, a multispectral sensor might miss the telltale signature of an invasive disease on a tree canopy while hyperspectral will most certainly catch it. And that can mean the difference between saving and losing a coffee-bean harvest or a valuable vineyard crop.



Tags: hyperspectral, Remote Sensing, UAS, UAV, University of Bristol

We're Giving Drones a Good Name

Posted by Christopher Van Veen on Wed, Oct 07, 2015

Drones seem to be in the news for all the wrong reasons. The media reminds us that they're nothing but nuisances: peeking at people, crashing into stadiums, hovering over the White House, and causing airliners to take evasive maneuvers. The FAA in this country is taking an active stance on the safe operation of drones, and the topic is being explored elsewhere around the globe. What everyone recognizes is that it's a world full of both promise and uncertainty. Indeed, the automobile was born under the same set of circumstances!

Having just returned from a week-long conference in Reno, Nevada, I want to use today's post to emphasize the good work drones can do. The biggest of these applications is precision agriculture, where a drone outfitted with the right instrumentation can fly over orchards and vineyards and spot telltale signs of disease that aren't readily seen from the ground. Monitoring irrigation levels and fertilizer effectiveness are two other key applications, as are climatology, pipeline monitoring, and geology.

Two makers of UAVs present at the conference are Headwall customers. PrecisionHawk builds a fixed-wing system while ServiceDrones offers a multi-rotor craft. There are reasons for using either. The amount of room you have to take off and land is one consideration; the overall battery life (flight duration) is another; and payload capacity is a third. The key task is to match everything to the mission, which is why integration is so important.

All told, the packaged technology of drones and sensors allows researchers to 'see' the invisible and learn more about the environment. Much of this territory is inaccessible by any ground-based means, which puts the risk to humans (and airliners) at the lower end of the scale. The use of drones has exploded for two primary reasons. Chief among them is affordability, which positions them much more favorably compared with manned fixed-wing aircraft. Second is ease of use. Drones are now more 'mainstream' than ever, and their ability to carry reasonable instrumentation payloads allows them to do this kind of scientific 'remote sensing.'

Instruments such as hyperspectral sensors are getting smaller, lighter, and more affordable. With them, scientists can now unlock hidden secrets and spot trends by analyzing very detailed, data-rich images. We are helping to create a 'new set of eyes' for the scientific community. The drones themselves become a vital 'delivery system,' and the pairing of these technologies is giving rise to conferences such as the ASPRS Mapping event in Reno. It was a combination of test flying and presentations, with the flying happening in gorgeous Palomino Valley, about 35 miles north of Reno.

Through it all, safety was paramount during the flying demonstrations. FAA inspectors were with us every step of the way to make sure that all the programmed flight plans were adhered to. Each drone had an 'N' registration number, just as a regular aircraft would. This is serious business with huge upside potential for geologists, crop scientists, the petroleum industry, and environmentalists. It pays to understand the regulations and work within them, because this whole business is a 'new frontier' for everyone. And while the term 'drone' conjures up a rather negative image, the more proper description is 'Unmanned Aircraft System,' or 'UAS' for short. These truly are 'systems' because they pair a flying machine (either fixed-wing or multi-rotor) with the instruments it carries.

And what kind of instruments? For precision agriculture, a hyperspectral sensor covering the Visible and Near-Infrared (VNIR) range of 400-1000nm will spot disease conditions on tree canopies. With entire economies depending on crops (hello, Florida citrus!), the ability to spot tree-borne diseases and other plant-stress situations is massively beneficial. First, the instruments are precise and can spot the 'invisible.' Second, the drones allow for the rapid and complete coverage of remote areas that might take days or weeks to map. And perhaps most telling, some disease conditions will only be visible from the top down rather than from the bottom up. An inspector on a ladder under a tree will likely miss something that the drone spots, and this can mean the difference between a bountiful harvest and a financial catastrophe. Any high-value crop (think citrus, wine grapes, pistachios, coffee beans, walnuts, etc.) needs this kind of imaging oversight. Our Nano-Hyperspec is extremely popular for this kind of work.
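
As an illustration of how a VNIR cube can flag plant stress, a generic vegetation index such as NDVI can be computed per pixel from a red and a near-infrared band. The band indices below are rough assumptions for a 270-band, 400-1000nm sensor; NDVI is a generic index, not a specific Headwall feature.

```python
import numpy as np

def ndvi(cube, red_band, nir_band):
    """Normalized Difference Vegetation Index per pixel from a reflectance cube
    shaped (lines, samples, bands). Low values can flag a stressed canopy."""
    red = cube[..., red_band].astype(float)
    nir = cube[..., nir_band].astype(float)
    return (nir - red) / np.maximum(nir + red, 1e-6)

# Hypothetical reflectance cube; band 120 is near 670nm (red), band 180 near 800nm (NIR)
cube = np.random.rand(500, 640, 270)
stress_map = ndvi(cube, red_band=120, nir_band=180)
print(stress_map.shape)
```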

When it comes to airborne work, one of the most desired attributes of a hyperspectral sensor is a wide field of view. Simply put, the sensor needs to deliver crisp hyperspectral data at the edges of its field of view just as it does directly underneath the flight path. The wider and sharper the field of view, the more efficient the flight path can be. And when it comes to drones, battery life determines the overall flight duration. So a hyperspectral sensor with an aberration-corrected wide field of view can cover more ground within a given flight envelope. More image data is thus collected on every flight, making the research project very efficient.
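
The efficiency argument is easy to quantify: for a nadir-pointing sensor, ground swath grows with the field of view at a given altitude. The numbers below are purely illustrative.

```python
import math

def swath_width_m(altitude_m, fov_degrees):
    """Ground swath covered by a nadir-pointing sensor with the given full field of view."""
    return 2 * altitude_m * math.tan(math.radians(fov_degrees) / 2)

# Illustrative comparison at a 100 m flight altitude
for fov in (30, 50):
    print(f"{fov} deg FOV -> {swath_width_m(100, fov):.0f} m swath")
# ~54 m vs ~93 m: the wider sensor covers roughly 70% more ground per flight line
```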

In addition to hyperspectral sensors, drones will also need a GPS to tie the incoming spectral data to its exact geographic location. Another frequently requested instrument is LiDAR (Light Detection and Ranging), which provides elevation detail that is paired with the hyperspectral data. Obviously, the combination of all these separate instruments makes for a payload that consumes valuable weight and space, and can push a system out of the realm of possibility for today's new breed of hand-launched UAVs. With that in mind, my company (Headwall Photonics, Inc.) takes the time to engineer and 'integrate' the sensor so that it is as small and as light as possible. Building data storage into the sensor is one way; direct-attaching the GPS is another. The connecting cables you don't need mean weight you don't have to lift!

Finally, conferences like the ASPRS event in Reno are places where people can learn. Understanding the challenges and potential integration pitfalls is what we at Headwall were there to convey, and our message was very well received. The mistake we all want to avoid is having users blinded by the promise of airborne hyperspectral imaging, dashing off and grabbing any affordable UAV and bolting instruments onto it. For one, such an approach is dangerously naive. Second, the time needed to integrate everything is practically always underestimated. And third, it becomes a very costly endeavor when the price of time is factored in.

At Headwall, although our business is the production of the industry's best hyperspectral imaging sensors, we understand integration issues better than anyone. We're here to help navigate the process and get the scientific research community in the air faster, doing all the good things 'drones' can do.

Tags: hyperspectral, Remote Sensing, Sensors, UAS, VNIR, UAV

Headwall Names Tom Breen as Director of Global Sales

Posted by Christopher Van Veen on Fri, Jun 06, 2014

Growth Markets Require Solid Industry Background Across Commercial and Defense Markets 

Fitchburg, MA – June 6, 2014 – With a rapid expansion of international business, Headwall Photonics announced today that Tom Breen has joined the Company as Director of Global Sales. Tom brings with him significant experience across many of the end-user markets served by Headwall. He will be responsible for managing Headwall’s growing worldwide sales activities and strategic opportunities for hyperspectral and Raman imagers as well as the Company’s OEM integrated spectral instrumentation.

Prior to joining Headwall, Tom held executive leadership positions at UTC Aerospace Systems, where he was responsible for sales and business development of airborne and hand-held products. He also served as Vice President of Sales and Marketing for General Dynamics' Axsys Technology Division in Nashua, New Hampshire. Other senior management positions at L-3 Communications, BAE Systems, and Lockheed Martin provided Tom with the background that will allow Headwall to grow its business in the hyperspectral imaging market.

“We are thrilled that Tom has joined our team,” said Headwall CEO David Bannon. “His background complements our commercial growth plans seamlessly and he will be a terrific asset in tackling a market that is experiencing very robust growth. Tom has had significant success in building high performance sales teams coupled with exceptional customer relationships.”

“I am very excited to be joining Headwall at a period of tremendous momentum for the Company and the industry,” said Tom. “As a leading supplier of spectral instrumentation, Headwall is uniquely poised to expand and deliver hyperspectral sensors and OEM instruments for remote sensing and in-line applications.”

Headwall’s award-winning Hyperspec and Raman imagers are used in commercial and military airborne applications, in advanced machine-vision systems, for document and artifact care, for plant genomics, in medicine and biotechnology, and for remote sensing. A unique differentiator for the Company is Headwall’s patented all-reflective, aberration-corrected optical technology that is fundamental to every system it produces.

Tom is a published author, with numerous works produced for IEEE, SPIE, and AAAE. Tom’s educational background includes MBA and BSEE degrees from Northeastern University in Boston.

Tags: hyperspectral, Headwall Photonics, Headwall, Tom Breen, Sales

Far Away and Long Ago: Hyperspectral Imaging Plays Key Role

Posted by Christopher Van Veen on Mon, Jun 02, 2014

Hyperspectral imaging sheds new light on prized Martian rock specimen

Scientists have forever been fascinated with space. What’s up there? Does life as we know it exist elsewhere? Is there any other celestial body like earth?  While these questions might lack solid and precise answers, it’s not for lack of trying. Knowledge often comes not from massive ‘Ah-HA!’ moments, but from smaller discoveries.  When stitched together, these jewels of learning present a useful mosaic for future scientists.

A two-billion-year-old meteorite—officially named NWA 7034 but nicknamed Black Beauty by scientists—crashed into the Sahara desert. It was found by scientists in 2011 and determined to be of Martian origin two years later. The geologic history of Mars has always been a fertile source of exploration given the never-ending interest in this relatively nearby yet mysterious planet. While exploring the Martian landscape provides a wealth of scientific data, this meteorite has itself been a goldmine of information. Why? Because it sheds light not on the Mars of here-and-now, but on what we believe happened 2.1 billion years ago to its geologic interior and surface.

The Black Beauty meteorite was lofted off the Martian surface by a large impact, an explosive geologic event. The intrinsic value of the rocks can be appreciated mostly because they carry a snapshot of what conditions were like on Mars at the moment the impact occurred. The Mars of today is fascinating, yes, but to have a sample of Mars from 2.1 billion years ago is more fascinating still. Indeed, Black Beauty is significantly older than almost all other Martian meteorites yet found.

In early 2014, a Brown University research team led by Dr. Jack Mustard and graduate student Kevin Cannon temporarily acquired a slice of Black Beauty from Dr. Carl B. Agee, Director of the Institute of Meteoritics at the University of New Mexico. The Brown University analysis included hyperspectral imaging using Headwall's VNIR (380-1000nm) and SWIR (950-2500nm) sensors to extract a wealth of meaningful spectral data. "We were really presented with a one-of-a-kind specimen in Black Beauty," noted Dr. Mustard. "We wanted to learn as much as we could and add to the body of geologic knowledge already accumulated."

The team paired the two sensors in Headwall's 'Starter Kit' configuration, which comprises a moving stage, the necessary illumination, and full software control to manage the collection and post-processing of the incoming data. "What we saw as we 'unpacked' the data is that Black Beauty is rich in information that gives us a clue as to what Mars was like over two billion years ago," said Cannon. "While rovers on Mars today are extracting important new data, to have an actual sample that we can analyze with our most sophisticated instruments is exciting."

In the adjacent hyperspectral image of Black Beauty, features become clear. The mineral feldspar shows up as green, and the mineral pyroxene comes out as yellow/red. "These two minerals make up most of the Martian crust, so it's exciting that we can see them and map them out spatially in the data," said Cannon.

There are a few characteristics of hyperspectral imaging that make it perfect for this sort of work. First, it is a non-invasive technology. That is, no samples are harmed or even touched. This is crucial, and the non-invasive nature of hyperspectral imaging lends itself not only to the study of Martian rocks like Black Beauty, but also the field of fine arts, artifacts and antiquities. Museums and collection-care experts are themselves seeing the value of hyperspectral imaging because of the amount of new information that can be collected non-invasively.

As a scanning technology, hyperspectral imaging is designed to ‘see the unseen’ and unlock the answers to challenging questions. There are numerous ‘imaging’ and ‘scanning’ techniques available to the scientific research community, but none possess the vast spatial and spectral information collected by Headwall’s instruments. "What we have been able to do is successfully introduce a brand-new tool into our toolbox and prove its value," said Dr. Mustard. "We saw things in the VNIR and SWIR spectral ranges that no one has seen before, and our overall body of knowledge is more expansive because of it." Hyperspectral imaging collects ALL the spatial and spectral data within the field of view, not just some of it (as is the case with multi-spectral).

And what about closer to home, here on earth? Hyperspectral imaging is becoming more mainstream and affordable so that research entities like Dr. Mustard’s group at Brown can tackle projects like these more readily than ever. Graduate student Rebecca Greenberger has done similar hyperspectral analysis on rock and geological formations that many of us drive by without glancing twice. "There’s a rock formation behind a Target store in Connecticut that is just loaded with incredible geological samples," said Greenberger. Many of those collected rock specimens have themselves been scanned with Headwall’s hyperspectral instruments, yielding spectacular results and new information about the geological history of our planet.

Tags: hyperspectral, Brown University, Martian meteorite, geology

Hyperspectral Takes Wing Over Ontario!

Posted by Christopher Van Veen on Thu, May 01, 2014

Under cloudless skies in Ontario recently, Headwall achieved a very notable milestone: we became the first to fly both hyperspectral and LiDAR sensors aboard a small, fully integrated handheld UAS. The test flights not only verified the reliable airworthiness of the system but also its ability to collect valuable hyperspectral and LiDAR data in real time.

Integration is key, because all of this specialized data-collecting instrumentation needs to fit the payload parameters with respect to size and weight. With UAS systems shrinking in size and weight, payloads need to follow suit. As prime contractor for this complete airborne system, Headwall is able to get end-users up and running quicker than ever. Time to deployment is reduced by months thanks to the work Headwall is doing to engineer optimized solutions that meet specific remote-sensing needs.

“The applications for this type of integrated airborne system are numerous,” said Headwall CEO David Bannon. “Precision agriculture is a key one we’re seeing on a global scale, but geology, pipeline inspection, environmental research, and pollution analysis are others.” Today’s UAS is smaller, lighter, and more affordable than ever, which makes it a perfect platform from which to carry precise imaging instruments such as hyperspectral and LiDAR sensors. “We’ve always been a pioneer in the area of small hyperspectral sensors for just these kinds of deployments,” noted Bannon. “Our strength comes from understanding what our users want to do and then engineering a complete airborne solution that meets that need.”

Chris Van Veen, marketing manager at Headwall, was on site to record and document the test flights. “A fully integrated package like this represents a new frontier for remote-sensing scientists who now have an airborne research platform that goes wherever they do,” says Chris. “Watching this fly and collect data in Canada was a thrill because it was visible testimony to all our integration work.”

The entire payload aboard this particular UAS is less than ten pounds, which includes hyperspectral, GPS/IMU, LiDAR, and computing hardware. Besides making sure these elements are small and light enough, the challenge of integrating everything with an eye toward battery lifetime is also Headwall’s to manage. “We know our remote-sensing users have very important work to do, and they need sufficient power not only to fly but also to operate the instruments,” said Bannon. One way to meet this challenge head-on is to make sure the hyperspectral sensor provides a very wide field of view with precise imagery from one edge to the other. “If you can assure outstanding image-collection across a wide field of view, and then provide orthorectification of that data, you’re covering more ground for each flight swath.”

Fundamental to accomplishing this is Headwall’s approach to optics, which is both simple and elegant. “Our diffractive optics approach uses no moving parts, which, in an airborne application, means robustness and reliability,” said Bannon. Inside each Micro-Hyperspec sensor is a precise and small holographic diffraction grating that manages incoming light with exceptional fidelity. These sensors are ‘tuned’ for the spectral range of interest to the user. “Depending on what the user wants to ‘see,’ he may need a VNIR sensor that operates from 380-1000 nanometers,” said Bannon. The spectral signature of a certain disease condition on a crop tree will determine the spectral range of the sensor, for example. Headwall has also introduced a wideband VNIR-SWIR sensor package that covers from 400-2500 nanometers. This co-registered hyperspectral instrument will be very popular with users who need broad coverage but need a small, light, and affordable instrument to do it with.

The following video will give you a peek into how flight testing went in Ontario.

Tags: hyperspectral, Airborne, Remote Sensing, UAS, UAV, agriculture

History Made, History Seen with Hyperspectral Imaging

Posted by David Bannon on Mon, Mar 03, 2014

As the market for hyperspectral sensing technology moves forward and advances, Headwall’s Application Engineering team has been able to gather a rare view into the past through the hyperspectral scanning of some of the most important historical artifacts and papers in the United States. For the first time ever, hyperspectral VNIR and SWIR imaging was conducted on key historical documents from the US Civil War period.

By working collaboratively with researchers in the Cornell University Division of Rare and Manuscript Collections and the Cornell Johnson Museum of Art, Janette Wilson and Kwok Wong of Headwall’s Application Engineering team spent a few days conducting VNIR and SWIR hyperspectral scans of some of the most important artifacts held by Cornell University. Of particular interest was the scanning of the University’s collection of original documents signed by President Abraham Lincoln during his presidency. This collection included the Gettysburg Address (seen at left), the Emancipation Proclamation, and the 13th Amendment to the Constitution.

The scanning of documents and artifacts with hyperspectral imagers is particularly well suited both to research and to establishing a baseline of spectral/spatial information for monitoring change in the artifacts, helping preserve objects of cultural heritage.
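
As a sketch of that second purpose, change monitoring can be reduced to a per-pixel spectral comparison between a baseline scan and a later one. The arrays, registration, and threshold below are hypothetical assumptions, not the workflow used at Cornell.

```python
import numpy as np

def change_map(baseline_cube, new_cube, threshold=0.05):
    """Flag pixels whose spectra have drifted from a baseline scan.
    Both cubes are co-registered (lines, samples, bands) reflectance arrays of the same object."""
    diff = np.abs(new_cube - baseline_cube).mean(axis=-1)   # mean spectral change per pixel
    return diff > threshold                                 # True where the artifact appears changed

# Hypothetical scans of the same document taken a year apart
baseline = np.random.rand(200, 300, 270)
later = baseline + np.random.normal(0, 0.01, baseline.shape)
print(change_map(baseline, later).sum(), "changed pixels")
```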

Hyperspectral imaging is particularly appealing to collection-care experts for a couple of main reasons. First, and probably most important, the technology is non-destructive. The instruments never touch the documents, and the lighting is 'cold illumination'; that is, there is no risk of thermal damage to the items under inspection. Second, previously unseen features immediately 'come to light' when viewed hyperspectrally. Note the image below, which shows a stamp on the Gettysburg Address that cannot be seen with the naked eye but becomes visible within the VNIR and SWIR spectral ranges. Collection-care experts are fascinated by such unseen features, which can be used to build the body of knowledge about a document or artifact.

Unseen Features

Tags: hyperspectral, SWIR, VNIR, Cornell University, artifacts, documents, Gettysburg Address

Hyperspectral Sensors for UAV Applications

Posted by Christopher Van Veen on Wed, Feb 19, 2014

The scientific research community is beginning to understand and embrace hyperspectral imaging as a useful tool for a few primary reasons. First, sensors are more affordable than ever. Originally conceived as multi-million-dollar ISR platforms for defense applications, hyperspectral imagers have been successfully ‘commercialized’ over the past few years. Scientists who previously relied on RGB or multispectral technology can now acquire hyperspectral sensors at affordable price points.

Hyperspectral sensors of the ‘pushbroom’ type produced by Headwall require motion to occur. That is, either the sensor flies above the field of view, or the field of view moves beneath the sensor. For UAV applications, Headwall’s small and lightweight Micro-Hyperspec is the platform of choice. Available in the VNIR (380-1000nm), NIR (900-1700nm), and SWIR (950-2500nm) spectral ranges, the sensor is truly ‘SWaP-friendly.’

Spectral range is often where the decision-making starts. The chemical fingerprint—or spectral signature—of anything within the field of view will lead the user in one direction or another. For example, a certain disease condition on a tree canopy may become ‘visible’ within the SWIR spectral range (950-2500nm). Similarly, a certain mineral deposit may become ‘visible’ in the VNIR range (380-1000nm). One approach to ensuring the spectral ‘fidelity’ of images collected by the sensor makes use of ‘diffractive optics’ comprising aberration-corrected holographic gratings. This ‘Aberration-corrected concentric’ design is shown below.

concentric imager

There are several advantages to this ‘reflective’ approach. First, the design is simple, temperature insensitive, and uses no moving parts. This assures robustness and reliability in airborne situations. Second, diffraction gratings can be made very small so that the instruments themselves can be small and light; in other words, capable of fitting the new class of lightweight, hand-launched UAVs. Third, the design optimizes technical characteristics that are most important: low distortion for high spatial and spectral resolution; high throughput for high signal-to-noise; and a tall slit for a wide field-of-view. Because the design is an all-reflective one, chromatic dispersion is eliminated and excellent focus is assured across the entire spectral range.
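
In practice, the spectral-range decision described above comes down to whether the wavelengths carrying a signature of interest fall inside a given sensor's range, and which band index they land on. A small helper, using the spectral ranges quoted in this post and a hypothetical band count:

```python
def nearest_band(wavelength_nm, range_start_nm, range_end_nm, num_bands):
    """Index of the spectral band closest to a wavelength of interest,
    assuming bands are evenly spaced across the sensor's range."""
    if not (range_start_nm <= wavelength_nm <= range_end_nm):
        raise ValueError("wavelength falls outside this sensor's spectral range")
    step = (range_end_nm - range_start_nm) / (num_bands - 1)
    return round((wavelength_nm - range_start_nm) / step)

# A feature near 1450nm needs the SWIR sensor (950-2500nm), not the VNIR one
print(nearest_band(1450, 950, 2500, 267))
```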

Many within the environmental research community and across ‘precision agriculture’ prefer to use UAVs as their primary airborne platform. They are more affordable than manned fixed-wing aircraft and easy to launch. But as UAVs get smaller and lighter, so must the payloads they carry. And integrating the sensor into the airframe along with other necessities such as LiDAR, power management/data collection hardware, and cabling can be a daunting task. Orthorectification of the collected data is another key requirement: it is the means by which the hyperspectral data cube is turned into useful information that has been corrected for any airborne anomalies. In other words, the collected hyperspectral data needs to be ‘true’ to what’s actually within the field of view.

 Micro Hyperspec

Acquiring a UAV and a hyperspectral sensor won’t assure compatible performance, and a high level of ‘integration work’ is needed. The UAV community and the hyperspectral sensor community are both challenged with pulling everything together. Recognizing this, Headwall Photonics is taking an industry-leading position as a supplier of fully integrated airborne solutions comprising the UAV, the sensor, the power and data management solution, cabling, and application software. The result is that users are flying sooner and collecting better hyperspectral data than ever before.

The type of UAV is very often one of the first decisions a scientist will need to make. Fixed-wing and multi-rotor are the two general categories, with numerous styles and designs within each. In-flight stability and flight duration are both paramount concerns, and this is where payload restrictions will often point toward one or the other. Multi-rotor UAVs launch and land vertically, so this type will be favored in situations where space is tight. Conversely, a fixed-wing UAV requires suitable space to launch and land but can provide longer flight duration and carry a heavier payload. The wide field-of-view characteristic of the concentric imager allows a UAV to ‘see’ more ground along its flight path.

Integrated airborne package

Two other key areas managed through Headwall’s integrative process are data management and application software. While a separate subsystem is used to control the sensor operation and store the hyperspectral data, the direction is clearly toward on-board integration of these capabilities. Flash storage and solid-state drives will soon make it possible for the sensor to ‘contain’ all the related functionality that now needs to be contained in a separate module. This will clearly lighten the overall payload, reduce battery consumption, and boost airborne flight time.

Headwall’s Hyperspec III software represents a complete, modularized approach to the management of hyperspectral data. Orthorectification is one such module within the software suite; it removes the unwanted effects of airborne behavior. The resultant orthorectified images have a constant scale wherein features are represented in their 'true' positions, allowing for the accurate direct measurement of distances, angles, and areas. Other aspects of the software suite can be used to control GPS/IMU devices, control multiple sensors simultaneously, and save polygons (a Google-Maps-enabled tool that allows the user to define geographic coordinates).
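
Because an orthorectified image has constant scale, ground distances can be read straight from pixel coordinates once the ground sample distance is known. A minimal sketch; the 5 cm GSD is an illustrative assumption, not a Hyperspec III parameter.

```python
import math

def ground_distance_m(px_a, px_b, gsd_m=0.05):
    """Straight-line ground distance between two pixel coordinates in an
    orthorectified image with a constant ground sample distance (m/pixel)."""
    (r1, c1), (r2, c2) = px_a, px_b
    return math.hypot(r2 - r1, c2 - c1) * gsd_m

# Two features 400 pixels apart along-track and 300 across-track -> 25 m apart at 5 cm GSD
print(ground_distance_m((100, 200), (500, 500)))
```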


Tags: hyperspectral imaging, hyperspectral, Airborne, Remote Sensing, Micro Hyperspec, agriculture, diffraction gratings, precision agriculture

Bone-Dating Using Hyperspectral Imaging!

Posted by Christopher Van Veen on Thu, Aug 29, 2013

Headwall recently completed some fascinating demonstration work on behalf of the Conservation Manager and several colleagues at London's Natural History Museum.

One of the hallmarks of hyperspectral imaging is its ability to non-destructively and non-invasively collect an invaluable amount of spatial and spectral data from anything within the field of view that reflects light. In commerce and environmental studies, hyperspectral imaging is a valuable and well-known tool that can ‘see’ the unseen.

Forensics is another exciting area of research. Take 300,000-year-old Neanderthal human bones, for example. Or a 300-year-old snake skin. Or a 400-year-old book of poems. Here you see two bones within the field of view of Headwall's VNIR starter kit. The smaller one is 'only' 200 years old; the larger is 300,000 years old. But the beauty of hyperspectral sensing is that it can classify and compare specimens like these with a tremendous amount of precision, yielding a level of scientific analysis that museums and 'collection-care' experts crave. The demonstration that Headwall performed was an exciting opportunity to show off not only the capabilities of the sensor, but also the capabilities of our new Hyperspec III software. The Conservation Manager was extremely pleased with the results of the demonstration. He even remarked that his museum would like to embrace and move forward with the opportunity to become a 'Centre of Excellence for Hyperspectral Imaging,' with Headwall as its sponsor.

Spectral ‘fingerprints’ contain a tremendous amount of useful data, and hyperspectral instruments can see these fingerprints and then extract meaningful data regarding the chemical composition of anything within the field of view. More helpful still, these instruments work in tandem with known spectral libraries that allow a very high degree of selectivity and discrimination. If you know the spectral fingerprint associated with a particular chemical, you can reference it against the hyperspectral data cube coming from the sensor. That fingerprint, once found, very often will be a ‘predictor’ of something else. Disease conditions in crop trees, for example, or the presence of certain inks or pigments on a document or artifact. That’s why precision agriculture and document verification are two other common deployment areas for hyperspectral imaging.

Tags: hyperspectral, Natural History Museum, Headwall, bone-dating, forensics, spectral analysis