Headwall Photonics Blog

Nano-Hyperspec, PrecisionHawk Impress!

Posted by Christopher Van Veen on Wed, Jul 06, 2016

Steven Sexton is a Technical Consultant at Aerial Imaging Services, LLC (Ephrata, WA). With the broad availability of new UAVs and high-performance hyperspectral imaging sensors, Steven's business is a good one. 'Remote sensing' is the study of agriculture, climatology, geology, and infrastructure from airborne platforms. The amount and quality of image data the sensors collect are remarkable, allowing scientists to make important decisions about crops, plant health, mineral deposits, and environmental trends.

Recently, Steven teamed up with PrecisionHawk (Raleigh, NC) and Headwall Photonics to put one of these 'flying laboratories' into the air. Because the combination of UAVs and specialized sensing instruments is still new to many users, ease of integration and strong customer support from PrecisionHawk and Headwall were what got Steven into the air collecting data-rich images of the ground below. PrecisionHawk took care of many of the airborne issues while Headwall addressed the hyperspectral side of the application. Together, the two companies helped Aerial Imaging Services reach a very impressive level of differentiation in a still-emerging business. The myriad mechanical, electrical, optical, and aerodynamic considerations can be daunting, and Steven took to LinkedIn on June 28, 2016 to tell his story:

*********************************************************************************

I am going to shift focus to sensors today. I recently acquired a Nano-Hyperspec sensor from Headwall Photonics and PrecisionHawk. This sensor is absolutely amazing, and it is configured to just plug right onto the Lancaster Rev 4 and the Lancaster 5. This plug-and-play setup is how all of the sensors PrecisionHawk sells are configured, making it extremely easy to do a visual scan, land, change to a BGNIR sensor, and fly. These sensors scan at much higher resolutions than most multispectral sensors. Around here, being at a higher altitude means not running into the trees lining fields, silos, buildings, etc. I get 1.5cm per pixel at 100m (329 feet) AGL. I go lower for LiDAR and thermal, to 60m (196 feet) AGL.

Now this may seem a little high, but with the higher-resolution sensors I got from PrecisionHawk it just made sense: less worry and more time to just watch the Lancaster do its thing. It also means I don't have to make as many passes over a field as I would at lower altitudes. Now, I haven't heard anyone state the resolution of some of the newer multispectral sensors that have recently come out, or of those I may not know of. If you use one of these and it achieves similarly high resolution, please either leave a comment and share your results or message me so I can add that information to this article. I want to give everyone a fair shake here.

steve_sexton.jpg

Now back to the Headwall Photonics Nano-Hyperspec® sensor. This unit is a little heavier than most of my other sensors; the LiDAR is about as heavy. The reason for the extra weight is a 500GB SSD attached to it. It also has a network cable interface to hook up to your computer or laptop. Please read the manuals that come with it; they will save you a lot of headaches when figuring out how to access the SSD on the sensor. You can find general information at the Headwall site here.

The customer service from Headwall is absolutely amazing. I decided to update the Nano driver software and missed one step and wound up not being able to access the data. Now this was totally my fault, I kind of went in blind to do the update.

Greg Chenevert from Headwall was extremely helpful and had me try a few things. These didn't work, but Greg spent time trying to help me get things going and guaranteed that they would get it back up. He put me in contact with one of the company's programmers; I gave him remote access to my system with the Nano hooked up and running, and he had it all set up, reconfigured, and doing imaging within probably 15 minutes at the most. Now I don't know about the rest of you, but most companies don't even come close to the customer service of Headwall and PrecisionHawk. They went way above and beyond to get the sensor working so I could do my job.

The unit itself is not that big, it is the bracket and connectors for the plug and play that make it seem larger. On average I can swap out a sensor and battery in about 30 to 45 seconds. The battery only lasts for about 35 minutes with the heavier load, but I get some pretty amazing images with it.

Headwall provides software with the sensor that is very useful and allows you to transfer the files to a local drive on your computer or laptop. You could even transfer the data to a USB drive if you have one that can hold the amount of data you get. There is also an option to view the data as NDVI, and this can be done in the field if you so desire. I usually just bring it back to the office, process it there, and add it to the other data sets I have gathered on that particular job. It does make the farmer happier if you can show it in the field.

Lancaster.jpg

When I first started, I was unsure of which sensors I should purchase. I imagine several of you have or had the same issue. I determined that if I only got sensors for agriculture, then I was going to be very poor during the winter months. So I decided to add the LiDAR, thermal, and Headwall Nano-Hyperspec sensors. This gives me the ability to do other types of work during the non-growing season. I also don't mind travelling to a location, or even going to another area for several weeks at a time, so this also opened up income opportunities.

The data is only as good as your sensors. Sure, the higher-quality imagery costs a bit more, but it also means the data is going to be more precise. Combined with the DataMapper algorithms, you get a very complete package from one source.
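*********************************************************************************

Steven mentions viewing his data as NDVI right in the field. For readers who haven't worked with the index, NDVI is simply a per-pixel ratio of near-infrared to red reflectance. The short sketch below shows the arithmetic on a generic reflectance cube; the array shapes, band wavelengths, and function names are illustrative assumptions, not part of Headwall's or PrecisionHawk's software.

```python
import numpy as np

def ndvi_from_cube(cube, wavelengths_nm, red_nm=670.0, nir_nm=800.0):
    """Compute NDVI = (NIR - Red) / (NIR + Red) for every pixel.

    cube           : reflectance data, shape (lines, samples, bands)
    wavelengths_nm : band-center wavelengths, shape (bands,)
    """
    red_band = int(np.argmin(np.abs(wavelengths_nm - red_nm)))
    nir_band = int(np.argmin(np.abs(wavelengths_nm - nir_nm)))
    red = cube[:, :, red_band].astype(np.float64)
    nir = cube[:, :, nir_band].astype(np.float64)
    denom = nir + red
    denom[denom == 0] = np.nan          # avoid divide-by-zero on dark pixels
    return (nir - red) / denom

# Synthetic example: 100 x 100 pixels, 270 bands spanning 400-1000nm.
wl = np.linspace(400, 1000, 270)
cube = np.random.rand(100, 100, 270)
print(ndvi_from_cube(cube, wl).shape)
```

Healthy vegetation reflects strongly in the near-infrared and absorbs red light, so NDVI values near 1 suggest vigorous growth while values near 0 suggest bare soil or stressed plants.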

Tags: Remote Sensing, UAV, precision agriculture, Nano-Hyperspec, PrecisionHawk

Hyperspectral Imaging & Processing Takes Hold in Turkey

Posted by Kevin Lynch on Mon, Jun 06, 2016

Hyperspectral imaging is a very valuable scientific technique for a wide range of applications that extend far beyond the defence and military uses the technology was originally conceived for. These include airborne precision agriculture, advanced machine vision, forensics, medical science, cultural preservation, and a range of others. For scientists in these application areas, hyperspectral imaging is often a new tool, and the need for education is correspondingly common. As a leader in the field of hyperspectral imaging, Headwall partners with its international resellers to sponsor workshops and events geared toward promoting an understanding of hyperspectral imaging and how best to use the rich data coming from the sensors.

Headwall and its valued partner in Turkey (Visratek) recently sponsored a "Workshop on Image and Signal Processing" organized by the IEEE GRSS Turkey Chapter under the "24th Signal Processing and Communications Applications" conference (the abbreviated Turkish name is SIU). Held annually since 1993, SIU has been organized in various cities throughout Turkey; in 2016 it was held in Zonguldak, hosted by Bulent Ecevit University. The events are very well attended: the 2015 SIU attracted more than 700 submitted scientific papers written in Turkish with abstracts in English, and a technical program committee of more than 600 experts provided more than 1,500 paper reviews. Attendance is roughly 600 leading scientists, researchers, and industry practitioners from Turkey. Says Visratek head Fatih Ömrüuzun, "This is the event where visibility is very important; our linkage with Headwall and our sponsorship of SIU puts us in a very good position as an expert in this field."

Visratek_1.jpg

Visratek_2.jpg

The IEEE SIU Conference is therefore the most significant scientific event for the signal processing and communications community in Turkey. The conference is a four-day event. The first three days are dedicated to keynote talks from distinguished lecturers from universities worldwide, along with tutorials and technical sessions that include oral or poster presentations of scientific papers. The 24th IEEE SIU conference also featured an industry forum and exhibition program where Headwall and Visratek were prominent. The fourth day of the conference was dedicated to social programs, including day trips to beautiful and historical towns such as Amasra, Safranbolu, and Kdz. Ereğli on the coast of Turkey's western Black Sea region.

Tags: hyperspectral, signal processing, Visratek, IEEE, Turkey

Community Partnership in Action!

Posted by David Bannon on Thu, Apr 28, 2016

Great companies can serve not only as an economic engine for customers, employees, suppliers, and business partners but also as a strong community partner based on how that company integrates itself into the communities in which it operates.

So many times, employees follow the same morning routine to get to work: a cup of coffee on the go, a 25-minute commute, pull into the parking lot, and do your thing every day. But picking your head up and looking at the community and the surrounding neighborhood as you go to work reveals a lot and helps define where we can have a positive social impact as a company and as individuals. This perspective helps us understand our community’s strengths, resources, and cultural diversity, and heightens our awareness of community need.

community_partnership.jpg

A cornerstone of strong, thriving companies is establishing a brand as a good community partner; a commitment to social responsibility and community philanthropy needs to be more ingrained in the cultural fabric and strategies of companies.

As an element of our FY16 success plan, Headwall has undertaken a new community service initiative with our employees. A group of employees conducted site visits to a number of community service organizations. As a result, Headwall is working to support Our Father’s House, a local non-profit organization based in Fitchburg that is providing shelter and transitional services to homeless men, women, and children in the community.

VOLUNTEER_PROJECT.jpg

Additionally, a group of five volunteers from Headwall went to a work project for Growing Places, a local organization that runs community gardens and teaches gardening classes to residents of Fitchburg and Leominster. The group met at the Growing Places warehouse in Leominster, where the first project involved power-washing three tanks to be used for storing water at the community gardens. One tank had been full of black dye, another contained glue, and a third one held corn syrup.

Everyone involved got stained with the black dye, and the team was ultimately unsuccessful in cleaning that tank. In the end, two tanks were cleaned and delivered to the Sundial Community Garden and the Cleghorn Community Garden, both in Fitchburg. You can read more about the projects here.

The work was challenging, but much of the danger was mitigated by putting the most dangerous engineer in a cage. The two cleansed tanks were loaded into a truck and delivered to the community gardens. The first garden was surrounded by a fence that the volunteers had to lift the tank over. After both tanks had been installed at the community gardens, the group moved on to a third site, adjacent to the Cleghorn Garden, to plant two large apple trees that had wintered over but were lying on their sides.

The trees were removed from the shallow holes in which they had spent the winter, the burlap and wire baskets were removed from the root balls, and two large holes were dug. A full group effort, led by Kevin Didona, successfully placed the trees upright in their new homes. The dirt was replaced, and the community now has two lovely apple trees.

The staff at Growing Places was very pleased with all we were able to accomplish, and there will be opportunities for more cooperation between that group and Headwall Photonics in the future. We have discussed the possibility of a project at the Headwall facility where volunteers can build frames for raised beds at the community gardens.

 

Tags: Fitchburg, community partnership, Volunteering, employee activities

Landmine Detection Using Hyperspectral Imaging

Posted by Christopher Van Veen on Wed, Apr 20, 2016

 

 

When you see how rapidly the use of drones for scientific research has risen, you first conclude that it's all about precision agriculture and climatology. To be sure, those are trendsetting applications when it comes to using hyperspectral imaging sensors aboard UAVs. But over at the University of Bristol, two scientists are leading a team focused on 'finding a better way' to detect the presence of landmines that kill or maim thousands of people annually. With around 100 million landmines underneath the ground globally, traditional means of finding and eliminating them would take about 1,000 years and cost upwards of $30 billion according to some estimates.

Find A Better Way is a UK-based NFP founded by famed footballer Sir Bobby Charlton. Despite his heroic sporting achievements, Sir Bobby is now forging a legacy outside of football through his determination to champion the cause of landmine detection and elimination. He witnessed the destruction caused by landmines on visits to Cambodia and Bosnia as a Laureus Sport for Good Ambassador. He founded Find A Better Way after recognizing that research and development held the key to making the major changes necessary to allow humanitarian teams to rid the world of the threat of landmines.

"We want to do something that very quickly delivers a step-change in capability while reducing overall human risk involved with finding and eliminating landmines," said Dr. Tom Scott of the University of Bristol. He along with Dr. John Day are pairing their advanced UAV with small and lightweight hyperspectral imaging sensors from Headwall Photonics to 'see' with a specificity and resolution unheard of only a few short years ago. "These drones can be autonomously deployed to fly over a landmine area and provide high-resolution images that allow us to reconstruct the 3-D terrain with very high accuracy," said Dr. Scott. With all the landmines across the world, tactical deployment of numerous low-flying drones is going to win the day over expensive satellites or high-flying aircraft. In order to meet this objective, the package needs to be simultaneously affordable, light, and suited to its mission. There is a vast amount of integration and testing work involved before the first meaningful flights can be flown. Recognizing this, Headwall is assuming much of this work so that users can compress this time-to-deployment significantly. Because the use of drones for scientific research is still in its infancy, misconceptions abound. Acquiring a UAV and slapping a hyperspectral sensor to it without first considering all the variables is a recipe for disaster. This holds true whether the mission is landmine detection or precision agriculture. More commonly, other instruments such as LiDAR and GPS are part of the payload as well. The end result is a carefully balanced exercise in aerodynamics, optics, electrics, and data-collection that companies such as Headwall are able to manage.

The human eye can only respond to wavelengths between roughly 390 and 700nm. Many of the reflective 'signatures' given off by plants and chemicals fall outside that range. For example, the Nano-Hyperspec sensor used by the Bristol team operates in what is called the 'Visible-Near-Infrared' range of 400-1000nm ('VNIR' for short). In that range, the sensor is 'seeing' with an extraordinarily high degree of specificity and resolution, far beyond what a human could discern. Indeed, these sensors are collecting an astounding amount of spectral data on a per-pixel basis, resulting in 'data cubes' many gigabytes in size. Armed with spectral libraries that faithfully characterize the specifics of the terrain below, scientists can match the known library information with the collected airborne data and make quite accurate calls on what's what.
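One common way to make that match is the spectral angle mapper, which compares the shape of each measured spectrum against library spectra independently of overall brightness. The sketch below is a minimal illustration of that general technique on a reflectance cube; it is not the Bristol team's actual processing pipeline, and the material names and array shapes are made up.

```python
import numpy as np

def sam_classify(cube, library):
    """Label each pixel with the library material at the smallest spectral angle.

    cube    : (lines, samples, bands) reflectance
    library : {material name: (bands,) reference spectrum}
    Returns an integer label map indexing list(library).
    """
    refs = np.stack(list(library.values())).astype(np.float64)   # (materials, bands)
    flat = cube.reshape(-1, cube.shape[-1]).astype(np.float64)   # (pixels, bands)
    flat /= np.linalg.norm(flat, axis=1, keepdims=True)
    refs /= np.linalg.norm(refs, axis=1, keepdims=True)
    # Spectral angle = arccos of the cosine similarity between spectra.
    angles = np.arccos(np.clip(flat @ refs.T, -1.0, 1.0))        # (pixels, materials)
    return angles.argmin(axis=1).reshape(cube.shape[:2])

# Illustrative use with made-up spectra: label 0 = 'soil', label 1 = 'disturbed soil'.
library = {"soil": np.random.rand(270), "disturbed soil": np.random.rand(270)}
labels = sam_classify(np.random.rand(50, 50, 270), library)
print(labels.shape)
```

Smaller angles mean closer matches, which is why the method tolerates the illumination differences that inevitably occur across a flight line.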

The use of drones has accelerated this scientific effort because of two factors: First, they are affordable and easy to deploy on a tactical basis. One can be packed in a Range Rover and deployed almost anywhere in minutes whereas aircraft and satellites are, by nature, constricted, inflexible, and costly. This is not to say that aircraft and satellites will be supplanted by UAVs. There is valuable image data that can be collected using these high-flying platforms, and the overall knowledge base shared by the scientific community is made much more complete when all these assets are used in a synergistic fashion. Indeed, hyperspectral imaging was once the province of high-flying reconnaissance planes and satellites...neither of which could ever be used economically by university scientists. But the ubiquitous drone--a bane to some and a blessing to others--is the perfect platform from which to launch these exploratory efforts. "We're adding a bit more science to the UAV payload now," says Dr. Day. "We're starting to look at the spectrum of light and the colors of light that are coming off the minefield and using that data to find where the landmines are."

UAVs and drones seem to get media attention for all the wrong reasons, which is exactly why efforts by the esteemed team at the University of Bristol are to be applauded for developing a 'Better Way' to solve some of our toughest challenges. Hyperspectral imaging sensors can 'see' even beyond the VNIR range of interest to Dr. Day and Dr. Scott. The Shortwave-Infrared (SWIR) range starts near where VNIR leaves off, covering around 950-2500nm. The presence of certain chemicals, minerals and of course plant photosynthesis will become visible to sensors like these. Indeed, a broadband sensor package that covers the VNIR and SWIR range (400-2500nm) is particularly useful because it basically collects everything a scientific research effort might wish to see.

There are two key factors about hyperspectral imaging that are worth noting. The first is that the technology depends on 'reflected light.' The sensor is basically looking at how sunlight reflects off certain materials. Plant fluorescence, for example, has a particular 'spectral signature' that a sensor can understand. Obviously, this means an airborne hyperspectral sensor depends on a healthy amount of solar illumination and certainly is useless at night. But Headwall's sensors are designed to collect precise image data even under less-than-ideal solar conditions (cloud cover, or low angles, for example). The second factor is having a wide field of view. The sensor obviously can 'see' directly beneath the line of flight, but being able to do so off to the wide edges of the flight pattern makes the mission more efficient. Batteries being what they are, optimizing the flight duration by capturing a wide swath of land is obviously beneficial. This benefit is seen in the precise optical layout used by Headwall in the construction of each sensor.
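Because the measurement is reflected sunlight, airborne practitioners typically convert raw sensor counts to reflectance using a dark frame and a calibrated white reference (or a downwelling irradiance measurement) so that results are comparable across changing illumination. Below is a minimal sketch of the white-reference approach; the function and variable names are illustrative, and this is not a description of Headwall's own calibration software.

```python
import numpy as np

def to_reflectance(raw, dark, white):
    """Empirical white-reference conversion of raw sensor counts to reflectance.

    raw   : (lines, samples, bands) raw counts from the scene
    dark  : (bands,) average dark-frame counts (the sensor's noise floor)
    white : (bands,) average counts over a calibrated white reference panel
    """
    scale = (white - dark).astype(np.float64)
    scale[scale == 0] = np.nan              # guard against dead bands
    refl = (raw - dark) / scale             # broadcasts over every pixel
    return np.clip(refl, 0.0, 1.5)          # tolerate mild over-brightness

# Synthetic example: 10 scan lines x 640 samples x 270 bands.
raw = np.random.randint(100, 4000, size=(10, 640, 270)).astype(np.float64)
dark = np.full(270, 100.0)
white = np.full(270, 4000.0)
print(to_reflectance(raw, dark, white).mean())
```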

Crop science, climatology, geology, and even the inspection of infrastructure such as pipelines and rail beds depend on imaging sensors like those produced by Headwall. Hyperspectral sensors depend on 'motion,' since they basically collect images slice by slice as the UAV flies over the scene. The combination of all these high-resolution 'slices' comprises what is known as a 'data cube,' which is pored over by scientists during post-processing. Of course, the hardware capturing these images represents about half the story. The other half can be found in the software that makes sense out of reams of spectral image data that all needs to be 'geo-tagged' and orthorectified. First and foremost, scientists need answers; the data (and the sensor collecting the data) are simply means to an end. When you go to your local DIY store, Lowe's, or Home Depot, you really aren't buying a drill; you're going there to buy a hole.
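For readers new to line-scan (pushbroom) imaging, the sketch below shows the basic bookkeeping: each across-track 'slice' is stacked into a cube and paired with the navigation fix recorded at the same instant. Real processing chains also fold in IMU attitude for orthorectification; the names, shapes, and numbers here are illustrative assumptions only.

```python
import numpy as np

def build_cube(line_frames, gps_fixes):
    """Stack pushbroom line frames into a cube and pair each line with a GPS fix.

    line_frames : list of (samples, bands) arrays, one per across-track scan line
    gps_fixes   : list of (lat, lon, alt) tuples, one per scan line
    Returns (cube, positions) with cube shaped (lines, samples, bands).
    """
    assert len(line_frames) == len(gps_fixes), "one GPS fix expected per line"
    cube = np.stack(line_frames, axis=0)
    positions = np.asarray(gps_fixes, dtype=np.float64)
    return cube, positions

# Illustrative use: 500 scan lines, 640 spatial samples, 270 spectral bands.
frames = [np.random.rand(640, 270).astype(np.float32) for _ in range(500)]
fixes = [(51.45 + i * 1e-6, -2.60, 100.0) for i in range(500)]
cube, positions = build_cube(frames, fixes)
print(cube.shape, positions.shape)   # (500, 640, 270) (500, 3)
```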

But is all that image data really needed? Some efforts seek to cut corners by using less-capable 'multispectral' sensors that cover only a few bands rather than the hundreds of bands covered with hyperspectral. Using crop science as an example, a multispectral sensor might miss the telltale signature of an invasive disease on a tree canopy while hyperspectral will most certainly catch it. And that can mean the difference between saving a coffee bean harvest or a valuable wine-vineyard crop.


 

Tags: hyperspectral, Remote Sensing, UAS, UAV, University of Bristol

Data Fusion: A New Capability for the Remote Sensing Community

Posted by Christopher Van Veen on Tue, Mar 01, 2016

We’re seeing a tremendous increase in the number of airborne deployments for our hyperspectral imaging sensors. To a large degree, the trend toward smaller and more affordable UAVs is giving the remote sensing community more flexibility to undertake more missions to capture meaningful environmental data. From wine-grape vineyards in northern California to coffee bean plantations in South America, the precision agriculture community is embracing packaged ‘UAS’ offerings that combine a UAV matched to the payload it needs to carry.

hypercore_illustration.jpg

Collecting meaningful, actionable data for a precision agriculture scientist can mean the difference between a healthy harvest and a disastrous one. Depending on the wavelength, the sensors will spot indices indicative of diseases, irrigation deficits, crop stress, and more. An affordable UAV thus takes the place of much more expensive manned aircraft flights. The financial savings notwithstanding, this new system can be hand launched and retrieved and basically deployed wherever and whenever (with adherence to all local aviation rules and regulations).

One trend we’re seeing at Headwall is the integration of multiple sensors, each having their own specific streams of data. For example, the payload might comprise a VNIR (400-1000nm) sensor along with a SWIR (1000-2500nm) instrument, but might also include LiDAR and typically a GPS/IMU. A Fiber-Optic Downwelling Irradiance Sensor (FODIS) is also often used to measure and collect data relative to changes in solar illumination.

Obviously, payload restrictions determine what the craft can lift and for how long. There is a balance to strike: choosing an affordable UAV that is small and light, while understanding that it might not be able to carry all the instruments a remote sensing mission might demand. Optimizing Size, Weight & Power (SWaP) is the guiding principle for missions involving UAVs. There are many fixed-wing and multi-rotor UAVs on the market, all of which specify their payload restrictions and flight durations.

The goal of any remote sensing activity is to see the unseen, and then make sense of the data during post processing. Because this data leads to important agricultural decisions, it’s crucial to synthesize the data from each instrument. The term for this is data fusion, and Headwall has just unveiled a new product called HyperCore™ that handles this important task. As the UAV flies its mission, data streams from the hyperspectral sensors, GPS/IMU, LiDAR, and other instruments are all collected on HyperCore’s 500GB drive for easy download (via Gig-E) later. HyperCore includes the most-used connections, including two Gig-E ports, a CameraLink port, power, and two I/O ports.
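A core part of any data-fusion step is putting every stream on a common timebase so that each hyperspectral frame can be paired with the navigation and ranging data recorded around the same instant. The sketch below shows one simple way to interpolate GPS/IMU records onto hyperspectral frame timestamps during post-processing; it is an illustration of the concept, not the HyperCore firmware, and all names and data rates are assumptions.

```python
import numpy as np

def align_navigation(frame_times, nav_times, nav_records):
    """Interpolate GPS/IMU records onto hyperspectral frame timestamps.

    frame_times : (frames,) capture time of each hyperspectral line, seconds
    nav_times   : (fixes,) timestamps of GPS/IMU records, seconds (increasing)
    nav_records : (fixes, channels) e.g. lat, lon, alt, roll, pitch, yaw
    Returns (frames, channels): one interpolated nav record per frame.
    """
    # Naive linear interpolation; wrap-around angles such as yaw need special handling.
    return np.column_stack([
        np.interp(frame_times, nav_times, nav_records[:, c])
        for c in range(nav_records.shape[1])
    ])

# Illustrative rates: 100 Hz hyperspectral frames for 10 s, 20 Hz navigation records.
frame_times = np.arange(0.0, 10.0, 0.01)
nav_times = np.arange(0.0, 10.0, 0.05)
nav_records = np.random.rand(nav_times.size, 6)
per_frame_nav = align_navigation(frame_times, nav_times, nav_records)
print(per_frame_nav.shape)   # (1000, 6)
```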


 

Tags: Airborne, Remote Sensing, UAS, UAV

Hyperspectral Imaging Technology a New Frontier for KAUST in Saudi Arabia

Posted by Christopher Van Veen on Thu, Oct 08, 2015

King Abdullah University of Science and Technology (KAUST) is a public research institution in Saudi Arabia. By any measure it is a very young university, founded in 2009. But in that short span of time it has grown rapidly and built an impressive record of research output and citations.

Being a science and technology university, KAUST focuses on traditional subjects such as math, electrical engineering, and computer science. KAUST is already well versed in spectroscopy, having a burgeoning lab full of instruments that can peer into the chemical underpinnings of minerals, plants, and crops. The lab is outfitted with spectroscopy, chromatography, and mass spectrometry instruments tasked with learning more about trace-metals analysis, wet chemistry, and surface analysis.

But one area of study is on pace to become its most popular: Earth and Environmental Sciences. Here, students and faculty pore over ways to use new technology in learning more about precision agriculture, water resources, and atmospheric conditions. This takes spectroscopy out of the lab and into the air, specifically using drones and UAVs.

KAUST_Nano_1_with_caption_small.jpg

Spectroscopy takes several different forms, but multispectral and hyperspectral are of primary interest within the scientific research community. The primary difference between the two is that hyperspectral imaging technology provides complete spectral information for every pixel in the scene...literally hundreds of bands. Comparatively, multispectral will only detect a handful of bands. In other words, it can mean the difference between discovering an invasive disease in a crop field or vineyard and missing it altogether. The Headwall sensors have a wide field of view (FOV) coupled with aberration-corrected optics, meaning that image data is as crisp and precise along the edges of the FOV as it is directly underneath the flight path. More data is collected for each mission, making the data-collection project more efficient. With battery life being key, efficiency matters.  Armed with spectral libraries that define the chemical composition of everything the sensor ‘sees,’ scientists have the ability to look well beyond what is actually ‘visible.’ 
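One way to appreciate the difference is to collapse a hyperspectral spectrum into a handful of broad bands, which is roughly what a multispectral sensor records. The sketch below does exactly that; the band definitions and array sizes are illustrative assumptions, not the specification of any particular instrument.

```python
import numpy as np

# Hypothetical broad multispectral bands (nm); real instruments vary.
MULTI_BANDS = {"blue": (450, 520), "green": (520, 600),
               "red": (630, 690), "nir": (760, 900)}

def simulate_multispectral(spectrum, wavelengths_nm):
    """Collapse a hyperspectral spectrum (hundreds of narrow bands) into a few
    broad-band averages, illustrating what a multispectral sensor would keep."""
    out = {}
    for name, (lo, hi) in MULTI_BANDS.items():
        mask = (wavelengths_nm >= lo) & (wavelengths_nm <= hi)
        out[name] = float(spectrum[mask].mean())
    return out

wl = np.linspace(400, 1000, 270)        # e.g. a VNIR hyperspectral band set
spectrum = np.random.rand(270)          # one pixel's reflectance spectrum
print(simulate_multispectral(spectrum, wl))  # 270 values reduced to 4
```

Narrow diagnostic features (a subtle absorption dip from a pigment or a disease symptom, for example) survive in the full spectrum but can vanish entirely once averaged into a broad band.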

Matthew McCabe, a professor at KAUST, recognizes the value of real-time environmental analysis in the work his Hydrology and Land Observation (HALO) Group does. “We are interested in exploring hyperspectral sensing to better understand plant health and function, particularly as relates to agricultural settings,” said McCabe. “The capacity to retrieve information on plant health is of interest not just for the obvious and important monitoring of crop state, but also in better constraining coupled water-energy-carbon models of vegetation systems.” According to McCabe, these models are generally poorly constrained so a system that enables for investigating plant health and condition in near real-time is of much interest. 

Already familiar with less precise multispectral instruments, McCabe is now investing in airborne hyperspectral sensors that see more and can deliver vast amounts of spectral data. “While we already employ multispectral sensors covering bands known to inform upon plant systems, it is the capacity to further expand our knowledge of plant spectral response that we are most interested in,” said McCabe. “Determining new spectral relationships in plant behavior and response is an area of research that hyperspectral imaging can really drive.”

KAUST_Nano_2_with_caption_small.jpg

The field of crop science is a key deployment for hyperspectral imaging. Here, McCabe chose Headwall’s Nano-Hyperspec sensor, which covers the core Visible/Near-Infrared (VNIR) range from 400nm to 1000nm. Most anything that a crop scientist wants to ‘see’ will be found in that VNIR range. “We wanted a very precise spectral imager that operated in this important VNIR range,” said McCabe. “We needed to ‘see’ spectral information for every pixel within the field of view, and we knew that Headwall’s Nano-Hyperspec had very precise edge-to-edge imaging performance that would optimize the flight efficiency of our UAV.”

With the newly acquired Nano-Hyperspec mounted aboard a KAUST-engineered quadcopter, McCabe and his team will set off to learn about plant-specific spectral-traits that provide direct insight into health and productivity functions. “This is one of our research goals,” said McCabe. “Hyperspectral sensing and analysis allows us to explore this exciting area of research in ways that multispectral cannot.”

Choosing a sensor and deploying it properly on a UAV is a challenge, especially since the technology is still relatively young. “We needed a sensor package that matched our UAV,” said McCabe. That meant it had to be lightweight and small, and it had to have integrated data storage for the many gigabytes of spectral data pouring into the sensor while aloft. The Headwall solution added a GPS/IMU and full software control for an overall package that put McCabe in the air far sooner than he’d otherwise expect. “Partnering with Headwall gave us months of headway in terms of getting in the air and collecting great data,” said McCabe.

And why Headwall? “We knew the company was a strong and well respected brand in the spectral sensing domain, with a long history of product excellence,” said McCabe. “We had researched a number of competing solutions, but Headwall turned out to be the most professional and competent among them.” McCabe also noted that numerous colleagues and scientists were currently using Headwall hyperspectral instruments with great success.

Moving forward, what are some of the exciting plans in the area of environmental analysis using Headwall’s hyperspectral system? “Our initial research will be focused on the indirect retrieval of plant pigments such as chlorophyll and carotenoids,” said McCabe. This spectral data will provide information on water use and stress condition of agricultural systems. “Ultimately, we are interested in better understanding plant water use through transpiration, but there are also opportunities for better constraining crop-yield through routine spatial sampling that is available when coupled to a UAV,” noted McCabe. “Hyperspectral sensing allows for a number of innovative ways to explore these ideas. But there are also clear opportunities beyond crop science. Indeed, we see applications across many of the multi-disciplinary research areas we are engaged in.” Interestingly but not surprisingly, McCabe is thinking well beyond UAVs: “We also have active projects in satellite validation and expect to engage with collaborators on both small and larger scale crop stress monitoring and even disease mapping using sensors mounted aboard L.E.O. commercial satellites.”

Of course, Headwall will be there when the time comes.


Tags: Remote Sensing, UAS

We're Giving Drones a Good Name

Posted by Christopher Van Veen on Wed, Oct 07, 2015

Drones seem to be in the news for all the wrong reasons. The media reminds us that they're nothing but nuisances: peeking at people, crashing into stadiums, hovering over the White House, and causing airliners to take evasive maneuvers. The FAA in this country is taking an active stance on the safe operation of drones, and the topic is being explored elsewhere around the globe. What everyone recognizes is that it's a world full of both promise and uncertainty. Indeed, the automobile was born under the same set of circumstances!

Having just returned from a week-long conference in Reno, Nevada, my post today is meant to emphasize the good work drones can do. The biggest among them is precision agriculture, where a drone outfitted with the right instrumentation can hover over orchards and vineyards and spot telltale signs of diseases that aren't readily seen from the ground. Monitoring irrigation levels and fertilizer effectiveness are two other key applications, as are climatology, pipeline monitoring, and geology.

Two makers of UAVs present at the conference are Headwall customers. PrecisionHawk builds a fixed-wing system while ServiceDrones offers a multi-rotor craft. There are reasons for using either. The amount of room you have to take off and land is one consideration; the overall battery life (flight duration) is another; and payload capacity is a third. The key task is to match everything to the mission, which is why integration is so important.

All told, the packaged technology of drones and sensors allows researchers to 'see' the invisible and learn more about the environment. Primarily this is territory largely inaccessible by any other ground-based means, which puts the risk to humans (and airliners) at the lower end of the scale. The use of drones has exploded for two primary reasons. Chief among them is affordability, which positions them much more favorably compared with manned fixed-wing aircraft. Second is ease of use. Drones are now more 'mainstream' than ever, and their ability to carry reasonable instrumentation payloads allows them to do this kind of scientific 'remote sensing.'

Instruments such as hyperspectral sensors are getting smaller, lighter, and more affordable. With them, scientists can now unlock hidden secrets and spot trends by analyzing very detailed, data-rich images.  We are helping to create a 'new set of eyes' for the scientific community. The drones themselves become a vital 'delivery system,' and the pairing of these technologies is giving birth to the kind of conference such as the ASPRS Mapping event in Reno. It was a combination of test flying and presentations, with the flying happening in gorgeous Palomino Valley located about 35 miles north of Reno.

Through it all, safety was paramount during the flying demonstrations. FAA inspectors were with us every step of the way to make sure that all the programmed flight plans were adhered to. Each drone had an 'N' registration number, as a regular aircraft would. This is serious business with huge upside potential for geologists, crop scientists, the petroleum industry, and environmentalists. It pays to understand the regulations and work within them, because this whole business is a 'new frontier' for everyone. And while the term 'drone' conjures up a rather negative image, the more proper description is "Unmanned Aircraft System," or 'UAS' for short. These truly are 'systems' because they pair a flying machine (either fixed-wing or multi-rotor) with the instruments it carries.

And what kind of instruments? For precision agriculture, a hyperspectral sensor covering the Visible and Near-Infrared (VNIR) range of 400-1000nm will spot disease conditions on tree canopies. With entire economies depending on crops (hello, Florida citrus!), the ability to spot tree-borne diseases and other plant-stress situations is massively beneficial. First, the instruments are precise and can spot the 'invisible.' Second, the drones allow for the rapid and complete coverage of remote areas that might take days or weeks to map. And perhaps most telling, some disease conditions will only be visible from the top down rather than from the bottom up. An inspector on a ladder under a tree will likely miss something that the drone spots, and this can mean the difference between a bountiful harvest and a financial catastrophe. Any high-value crop (think citrus, wine grapes, pistachios, coffee beans, walnuts, etc.) needs this kind of imaging oversight. Our Nano-Hyperspec is extremely popular for this kind of work.

When it comes to airborne work, one of the most desired attributes of a hyperspectral sensor is a wide field of view. Simply put, the sensor needs to deliver crisp hyperspectral data at the edges of its field of view just as it does directly underneath the flight path. The wider and sharper the field of view, the more efficient the flight path can be. And when it comes to drones, battery life determines the overall flight duration. So a hyperspectral sensor having an aberration-corrected wide field of view can cover more ground for a given flight envelope. More image data is thus collected on every flight, making the research project very efficient.
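The geometry behind that efficiency argument is straightforward: for a nadir-pointing line-scan sensor over flat terrain, the ground swath is roughly 2 x altitude x tan(FOV/2), and the across-track ground sample distance is the swath divided by the number of spatial pixels. The numbers below are purely illustrative and are not taken from any Headwall spec sheet.

```python
import math

def swath_and_gsd(altitude_m, fov_deg, samples):
    """Ground swath (m) and across-track GSD (cm/pixel) for a nadir-pointing
    line-scan sensor over flat terrain: swath = 2 * h * tan(FOV / 2)."""
    swath_m = 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)
    gsd_cm = swath_m / samples * 100.0
    return swath_m, gsd_cm

# Illustrative numbers only: two hypothetical fields of view at 100 m AGL.
for fov in (30.0, 50.0):
    swath, gsd = swath_and_gsd(altitude_m=100.0, fov_deg=fov, samples=640)
    print(f"FOV {fov:4.1f} deg at 100 m: swath {swath:6.1f} m, GSD {gsd:4.1f} cm")
```

Widening the field of view at a fixed altitude covers more ground per pass, which is exactly why fewer flight lines (and less battery) are needed to map the same area.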

In addition to hyperspectral sensors, drones will also need a GPS to tie the incoming spectral data to its exact geographic location. Another frequently requested instrument is LiDAR (Light Detection and Ranging), which provides elevation detail that is paired with the hyperspectral data. Obviously the combination of all these separate instruments makes for a payload that consumes valuable weight and space, and is thus out of the realm of possibility for today's new breed of hand-launched UAVs. With that in mind, my company (Headwall Photonics, Inc.) takes time to engineer and 'integrate' the sensor so that it is as small and as light as possible. Combining the data storage inside the sensor is one way; direct-attaching the GPS is another. The connecting cables you don't need mean weight you don't have to lift!

Finally, conferences like the ASPRS event in Reno are places where people can learn. Understanding the challenges and potential integration pitfalls is what we at Headwall were there to convey, and our message was very well received. The mistake we all want to avoid is having users blinded by the promise of airborne hyperspectral imaging, dashing off and grabbing any affordable UAV and bolting instruments onto it. For one, such an approach is dangerously naive. Second, the time needed to integrate everything is practically always underestimated. And third, it becomes a very costly endeavor when the price of time is factored in.

At Headwall, although our business is the production of the industry's best hyperspectral imaging sensors, we understand integration issues better than anyone. We're here to help navigate the process and get the scientific research community in the air faster, doing all the good things 'drones' can do.

Tags: hyperspectral, Remote Sensing, Sensors, UAS, VNIR, UAV

Spectral Imaging Solves Mystery of Stolen Books by Suicidal Librarian

Posted by Christopher Van Veen on Tue, Jul 21, 2015

Multi-Million Dollar Theft Of National Heritage, Followed By International Intrigue, Suicide, And An Explosion Injuring Dozens...

Not Hollywood But … Headwall’s Hyperspectral Imaging Sensors Provide Forensic Analysis to Help Solve the Mystery & Repatriate a Stolen Rare Book!

The_Book

David Walter Corson is the curator of the History of Science collection at the Cornell University Library. Through years of study and procurement, the Cornell collection...35,000 volumes in all...has come to include esteemed and cherished works by Sir Isaac Newton and others, and represents the world’s most extensive university collection on the evolution of scientific thought and research over the past few centuries.

One of the books in this vast collection at Cornell was simply known as the ‘Oculus,' written by Christopher Scheiner in 1619. Scheiner was a brilliant geometer, physicist, and astronomer who developed theories of optics that later formed the basis for the development of lenses. Scheiner’s book showed that the retina is the seat of vision, and it was a recognized treasure that Cornell was very pleased to have acquired in 1999 from Jonathan Hill, the preeminent New York-based bookseller of rare books and manuscripts. However, five years later the Scheiner book was reported to have been stolen from the National Library of Sweden, along with around 55 other notable works worth millions.

Unbeknownst to Jonathan Hill or to David Corson, the Library of Sweden theft was a daring one, carried out by an employee of the National Library by the name of Anders Burius. Burius sold the works he stole between 1995 and 2004 to the German auction house Ketterer Kunst. He was arrested and confessed in 2004. However, while free on bail in Sweden, Burius attempted suicide by slitting his wrists and then cutting the gas line in his apartment. The resulting explosion finished the deed, flattening the building and injuring scores of neighbors.

The story is widely known throughout Sweden and is recounted in The History Blog, http://www.thehistoryblog.com/archives/17824.

And, yes, it is also the subject of a television mini-series.

Where do Headwall’s hyperspectral imaging sensors fit in?

Initially, Headwall’s spectral imaging analysis was focused on the examination of spectral enhancement techniques on Cornell’s extensive collection of Lincoln documents such as the Gettysburg Address, the Emancipation Proclamation, the 13th Amendment to the US Constitution, Lincoln Executive Mansion Letters, and others.

book_blog_image_1

Studying historically significant treasures like Cornell’s Lincoln Collection must be done with great care. The Hippocratic oath for physicians (and, by extension, for curators and conservators) somewhat loosely states, 'First, do no harm.' Indeed, any sort of spectral imaging must not involve harsh lighting or heat, or be otherwise 'invasive.' Assured that it was a safe form of scientific analysis, Cornell teamed with Headwall to carefully image some artifacts, artwork, and rare books that can be truly described as priceless treasures of cultural heritage.

Now back to Scheiner’s Oculus book … As Sweden worked with Interpol to track down these stolen treasures, David Corson of Cornell became aware of the book theft and began working with the FBI to determine if the Cornell version of Scheiner’s Oculus was in fact stolen from the National Library of Sweden.

To make a forensic determination, some scientific analysis was necessary, which led Cornell to Headwall's expertise in hyperspectral imaging and to the use of Headwall’s Hyperspec VNIR and SWIR sensors to analyze the book. Headwall’s non-invasive hyperspectral imaging technique yielded a highly resolved spectral and spatial datacube that allowed application engineers to analyze component and constituent material differences in the book, such as color change, deterioration, and alterations, and to identify disguised text and “under drawings” not visible to the naked eye.

book_blog_image_2

"Your analytic techniques were exactly what we needed," explained Corson. "The totality of the circumstantial evidence that emerged from Headwall's study of The Scheiner Book is, indeed, what ultimately convinced us to 'repatriate' the volume." Through the use of data collected by Headwall’s VNIR and SWIR hyperspectral sensors, Janette Wilson of Headwall’s technical sales team undertook a rigorous PCA approach (Principal Component Analysis) that was able to yield definitive proof that faint, non-visible markings on the book were correlated to the National Library of Sweden’s catalogue system.

"We had independently asked the National Library of Sweden whether there were any unique bookplates or similar identification devices that the library might have used in the past that we should look for," recalled Corson. But there was no apparent evidence of previous bookplates in Cornell's acquired copy, aside from the one Cornell added themselves. But spectral imaging revealed remnants in the corner sections of the front pastedown (a reasonable location for a bookplate) of a previous label. Corson thus had his first bit of evidence: "This finding showed measured dimensions almost identical to those of a bookplate the National Library had told us was once used in their books!"

The challenge before Corson was compounded by the fact that the National Library could not accurately corroborate any of these findings after it was determined that all of its records had been destroyed. "The effect was as if the Library had never even had the titles in its holdings," recalled Corson. In any case, the National Library did send Corson two possible shelf marks for the Scheiner volume, based on its best recollection. Hyperspectral imaging revealed that the sequence of 'marks' on the Cornell copy was the same as the shelf marks provided by the National Library.

In the end, hyperspectral imaging did indeed prove instrumental in uncovering findings previously unknown. “What is significant is that Headwall’s technique and approach can reveal definitive evidence in situations like these,” noted Corson. The technology readily adapts itself for use with paintings, maps, manuscripts, even non-flat artifacts.

Tags: forensics, Cornell University, artifacts, antiquities, Artwork

Hyperspectral Imaging Successfully Screens Cancer Tissue

Posted by Christopher Van Veen on Wed, Apr 29, 2015

Healthcare isn’t a new frontier when it comes to imaging, but hyperspectral technology is opening eyes and capturing spectacular, life-changing results.

Over the past several months and culminating in a very successful technical review in late February, a European collaborative project named HELICOID (HypErspectraL Imaging Cancer Detection) used Headwall’s hyperspectral sensors to discriminate between healthy and cancerous brain tissues. The focus on brain cancer is especially meaningful because this tissue, more than almost any other type of cancer, can resemble the normal surrounding tissue. This makes it difficult to isolate under normal imaging techniques. Indeed, while the HELICOID project focused on brain cancer, hyperspectral imaging will be beneficial for breast and lung cancer as well.

Drawing on technical resources from the Headwall BVBA office near Brussels and on our application partner in Spain, HELICOID, the Belgium-led medical collaborative, worked closely with Headwall to develop the technical solution for the spectral imaging sensor.

Early results derived with the Headwall Hyperspec imaging system are impressive: the hyperspectral sensor system can "potentially accelerate cancer diagnosis and improve proper cancer removal, ultimately saving lives," as reported by Lung Cancer News Today.

One of the hallmarks of hyperspectral imaging is its ability to identify objects or disease conditions based on the chemical composition of tissue within the field of view of the sensor. By working closely with medical collaborators, the sensors were ‘tuned’ to the precise spectral features scientists are looking to find. By offering a precise definition of the boundaries of the cancerous tissue in real time, hyperspectral imaging can potentially accelerate cancer diagnosis and improve proper cancer removal. Surgeons can thus remove exactly what needs to be removed while leaving healthy tissue untouched.
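In practice, 'tuning' to spectral features of interest usually means training a per-pixel classifier on spectra whose tissue type has been confirmed by pathology, then applying it across the whole scene. The sketch below illustrates that general workflow with a generic classifier; it is not the HELICOID team's algorithm, and the data, band counts, and labels are synthetic placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_tissue_classifier(spectra, labels):
    """Fit a per-pixel classifier on labeled spectra (e.g. 0 = healthy, 1 = tumor).

    spectra : (pixels, bands) reflectance samples annotated by pathology
    labels  : (pixels,) class labels
    """
    return RandomForestClassifier(n_estimators=200, random_state=0).fit(spectra, labels)

def classify_scene(model, cube):
    """Apply the trained model to every pixel of a (lines, samples, bands) cube."""
    lines, samples, bands = cube.shape
    pred = model.predict(cube.reshape(-1, bands))
    return pred.reshape(lines, samples)

# Synthetic illustration only; real work uses pathology-confirmed training data.
train_spectra = np.random.rand(1000, 128)
train_labels = np.random.randint(0, 2, size=1000)
model = train_tissue_classifier(train_spectra, train_labels)
tumor_map = classify_scene(model, np.random.rand(64, 64, 128))
print(tumor_map.shape)   # (64, 64) per-pixel class map
```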

“Hyperspectral imaging technology holds enormous promise for medical applications and, as the leader in spectral imaging solutions, Headwall will continue to make significant contributions to advancing industry capabilities,” said Chris Van Veen of Headwall’s Marketing group.

Tags: Cancer Detection

Hyperspectral Takes Old Maps Into New Territory

Posted by Christopher Van Veen on Thu, Mar 26, 2015

Late in 2014, Headwall sponsored a successful event at London’s Natural History Museum. The purpose of the gathering was to introduce curators and preservationists to the advantages and capabilities of hyperspectral imaging. Professionals in this field understand that the treasures under their control...paintings, documents, and artifacts...need to be preserved using the most advanced techniques available. Preservation largely means having an excellent understanding of the chemical composition of the underlying materials used to create the treasures. And what the eye cannot see, hyperspectral imaging can.

The Bodleian Library (Oxford, UK) has been an acknowledged pioneer with respect to the use of spectral imaging technology. While newer than other imaging techniques, hyperspectral is relatively affordable and provides a wealth of image data that experts can pore through. With this data, the overall body of knowledge about treasures of enormous historical prestige and significance is greatly expanded. The identification of specific materials, inks, pigments, and substrates can help determine when (and perhaps even where) a document or artifact was created. Everything a hyperspectral sensor sees can be categorized with respect to its chemical signature, or ‘fingerprint.’ The color ‘yellow’ resonates a certain way to the eye, but spectral imaging can discern the chemical composition of a particular yellow and match it to known spectral libraries. The results are clearly beneficial to the Bodleian, which is why the Library has gone to great lengths to partner with Headwall Photonics to implement systems geared specifically to what it would like to see and learn.
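As a concrete illustration of matching a measured color against known references, the sketch below compares one pixel's spectrum to a small library of reference pigment spectra and reports the best correlation. The pigment names and spectra here are placeholders, not the Bodleian's or Headwall's actual library.

```python
import numpy as np

def best_pigment_match(pixel_spectrum, pigment_library):
    """Return the library pigment whose reference spectrum correlates best with
    the measured pixel spectrum, plus the correlation value."""
    best_name, best_corr = None, -1.0
    for name, ref in pigment_library.items():
        corr = float(np.corrcoef(pixel_spectrum, ref)[0, 1])
        if corr > best_corr:
            best_name, best_corr = name, corr
    return best_name, best_corr

# Made-up reference spectra for two historical yellow pigments (illustrative only).
library = {"lead-tin yellow": np.random.rand(270), "orpiment": np.random.rand(270)}
name, corr = best_pigment_match(np.random.rand(270), library)
print(name, round(corr, 3))
```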

Bodleian

Two prized maps at The Bodleian...the 17th-Century Selden Map of China, and the medieval Gough Map of Britain...recently underwent precise analysis using Headwall’s hyperspectral sensor. The Gough Map in particular represents a mystery to Bodleian experts: when was it created, by whom, and why? By illuminating the map with non-invasive, non-destructive ‘cold’ lighting, the near-infrared and shortwave-infrared sensors collect a digital map of inks and materials. The analysis even highlights features that were deliberately masked and others that simply faded or flaked away over time.

The Bodleian’s David Howell, an early advocate of spectral/chemical imaging and who helped spearhead Headwall’s Natural History Museum event, has been extremely pleased at the results seen thus far. In an interview with the BBC, Howell said that he was “blown away by the data that’s already coming out.” He noted that the technology first and foremost does not put the treasures at risk. The imaging illumination is non-destructive and the treasures themselves do not need to be removed from their climate-controlled premises.

Howell concluded with a plug for the promise of hyperspectral imaging technology: “Our biggest problem now is there’s just so much data to sort through to fully explore what we’ve uncovered!”

To read the BBC article on this exciting venture, click here.

Tags: hyperspectral imaging, artifacts, antiquities, Artwork, artwork preservation