If I asked you to highlight the dirt road at the top of this image, you could do it super easily.
Aerial photo of a patch of dirt and shrubs. Red pixel marks the source.
Now what if I asked you to highlight all the grass and shrubs? Not the dirt, or the rocks, or the road, just the grass. And once you’ve done that, describe how healthy that grass is—not so easy, right? For the curious, this is what it would look like:
Aerial photo of a patch of dirt and shrubs, with the vegetation isolated using SAM.
So how do researchers figure out which pixels are plants and which are dirt? We use a mathematical tool called SAM, a Spectral Angle Mapper. “Spectral” simply means we’re looking at the light that bounces off the ground, and “Angle Mapper” is a fancy way of asking “how similar are these two pixels?” Each pixel’s spectrum can be treated as a vector, and the angle between two of those vectors tells us how alike the materials are: a small angle means a close match. So what we’re doing is picking one small part of the picture that we know is grass, and then comparing the rest of the picture to that small grass patch. If a pixel is very similar, it’s probably plant life. If it’s not similar at all, it’s probably something we don’t need to analyze.
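To make that concrete, here’s a minimal sketch of the spectral-angle math in Python with NumPy. This is my own illustration, not Earth Lab’s actual code, and the image, the “grass” spectrum, and the 0.1-radian threshold are all toy values invented for the example:

```python
import numpy as np

def spectral_angle(image, reference):
    """Spectral Angle Mapper: the angle (in radians) between each pixel's
    spectrum and a reference spectrum. Smaller angle = more similar material."""
    # image has shape (rows, cols, bands); reference has shape (bands,)
    dot = np.tensordot(image, reference, axes=([2], [0]))
    norms = np.linalg.norm(image, axis=2) * np.linalg.norm(reference)
    cos = np.clip(dot / norms, -1.0, 1.0)  # clip to guard against rounding
    return np.arccos(cos)

# Toy 2x2 "image" with 3 spectral bands
img = np.array([[[0.1, 0.5, 0.9], [0.2, 1.0, 1.8]],
                [[0.9, 0.5, 0.1], [0.5, 0.5, 0.5]]])
grass = np.array([0.1, 0.5, 0.9])   # the pixel we know is grass

angles = spectral_angle(img, grass)
mask = angles < 0.1                 # "similar enough" threshold, in radians
```

Note that the second pixel is exactly twice the grass spectrum, so its angle is zero: SAM cares about the *shape* of the spectrum, not its overall brightness, which is handy because a shaded patch of grass still points in the same spectral “direction” as a sunlit one.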
This is cool, but what can we do with it? For one, now that we can isolate which pixels are plant material, we can do further processing on the images. For example, we can easily assess how healthy the vegetation is without worrying about interference from all the non-living items in the picture. Because of this, we can pair SAM with satellite imagery and other tools like NDVI to figure out how nature is changing through time. We have pictures of the Earth going back 40 years, so it’d be incredibly useful to see and quantify how things have changed over this almost half-century span.
Remember though, it’s not just plant life that SAM can detect. It’s any piece of the image that you want to isolate. Here, for example, we wanted to find all the pixels made of the tennis court material, and none of the rest.
Left: RGB aerial photo of a tennis court. Right: The court material isolated using SAM. The source pixel (endmember) is highlighted in red on the left image.
This was part of a bigger project that we also used SAM to help with. Two groups, NEON and Headwall Photonics, both do remote sensing: they take pictures of the Earth from a distance and use the data to gain a better understanding of the land. They differ in how they do this, though. NEON uses a manned airplane with a large (~3 feet in diameter!) hyperspectral camera. It’s hyperspectral because, unlike your phone, which only captures the light we see with our eyes, it also takes pictures in hundreds of other wavelengths of light. It’s like how a thermal camera can actually see how hot an object is, except NEON’s can see much, much more than just heat. Headwall has a similar camera, but it’s far smaller (only a couple of inches long), and they use an unmanned drone instead of a large piloted aircraft.
There are advantages to both NEON and Headwall’s methods of remote sensing, but they wanted to compare their data against each other. Sounds like a perfect application for SAM! Both groups flew over the same spot, shared their data with Earth Lab for study, and the results were presented at an AGU conference.
Earlier I mentioned NDVI as a method of assessing how healthy plant life is. But how can we determine this without ever touching the plant, taking measurements of the soil, or interacting with it directly at all? As it turns out, plants reflect more light than just what our eyes can see, and this light can directly tell us if the vegetation is healthy or if it’s dying, water deprived, nutrient deprived, etc. The math is actually not too difficult: NDVI is just (NIR - Red) / (NIR + Red), where NIR is how much near-infrared light the surface reflects and Red is how much visible red light it reflects. As a coding exercise I decided to program a simple NDVI calculator in R and Python. You should pick your favorite language and try it too! There are plenty of publicly available datasets with near-infrared pictures.
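If you want to try it yourself, here’s roughly what the Python version boils down to. The band values below are invented toy numbers, not real satellite data:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Healthy plants reflect lots of near-infrared light, so values near 1
    mean healthy vegetation; values near 0 or below mean bare ground,
    water, snow, or dead material."""
    nir = nir.astype(float)
    red = red.astype(float)
    with np.errstate(divide="ignore", invalid="ignore"):
        out = (nir - red) / (nir + red)
    return np.nan_to_num(out)  # pixels with no signal (0/0) become 0

# Toy 2x2 reflectance bands, values in [0, 1]
nir = np.array([[0.8, 0.6], [0.1, 0.0]])
red = np.array([[0.1, 0.2], [0.1, 0.0]])

vals = ndvi(nir, red)
print(vals)
```

The top-left pixel (lots of near-infrared, little red) scores high like healthy vegetation would, while the bottom row scores at or near zero like bare dirt or snow.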
As you can see, the snow and wooden railing are purple, meaning they’re either not plants or entirely dead. The mountainside, even though it’s covered in shade, is bright green, which shows it has healthy plant life on it. These are just two of many tools that remote sensing scientists and Earth Lab use to better understand our planet!