Multispectral and hyperspectral images allow us to ‘see’ wavelengths of light that are invisible to the human eye.
This advanced imaging opens up new ways to understand the world, and it has enabled hundreds of important drone applications, especially in agriculture, ecology, oil and gas, oceanography, and atmospheric studies.
We’re often asked to explain the difference between multispectral and hyperspectral imagery. Here’s a good explanation from GIS Geography:
Reading this post, your eyes see reflected energy. But your computer sees it in three channels: red, green and blue.
- If you were a goldfish, you would see light differently. A goldfish can see infrared radiation, which is invisible to humans.
- Bumble bees can see ultraviolet light. Again, humans can’t see ultraviolet radiation with our eyes.
Now, imagine that we could view the world through the eyes of a human, a goldfish, and a bumble bee. This is what multispectral and hyperspectral sensors allow us to do.
The Electromagnetic Spectrum
Visible (red, green and blue), infrared, and ultraviolet are regions of the electromagnetic spectrum. We made up these regions to classify radiation conveniently. Each region is categorized by its wavelength range (or, equivalently, its frequency ν).
- Humans see visible light (380 nm to 700 nm)
- Goldfish see infrared (700 nm to 1 mm)
- Bumble bees see ultraviolet (10 nm to 380 nm)
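The three regions above can be sketched as a simple wavelength lookup. This is a minimal illustration using the boundaries quoted in this post, not a formal definition of the EM spectrum:

```python
# Spectral regions from the list above (wavelengths in nanometers).
REGIONS = [
    ("ultraviolet", 10, 380),       # what a bumble bee can see
    ("visible", 380, 700),          # what humans see
    ("infrared", 700, 1_000_000),   # what a goldfish can see (up to 1 mm)
]

def classify(wavelength_nm: float) -> str:
    """Return the spectral region a given wavelength falls into."""
    for name, low, high in REGIONS:
        if low <= wavelength_nm < high:
            return name
    return "outside these regions"

print(classify(550))  # green light -> visible
print(classify(900))  # near infrared -> infrared
print(classify(300))  # -> ultraviolet
```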
Multispectral and hyperspectral imagery gives us the power to see as humans (red, green and blue), goldfish (infrared), and bumble bees (ultraviolet) do. In fact, a sensor can capture even more than this: any EM radiation reflected toward it.
The main difference between multispectral and hyperspectral is the number of bands and how narrow the bands are.
Multispectral imagery generally refers to 3 to 10 bands, normally captured using a remote sensing radiometer.
Multispectral Example: 5 wide bands
Hyperspectral imagery consists of much narrower bands (10-20 nm). A hyperspectral image could have hundreds or thousands of bands, normally captured by an imaging spectrometer.
Hyperspectral Example: Imagine hundreds of narrow bands
An example of a multispectral sensor is Landsat-8, which images in the following 11 bands:
- Coastal aerosol in band 1 (0.43-0.45 µm)
- Blue in band 2 (0.45-0.51 µm)
- Green in band 3 (0.53-0.59 µm)
- Red in band 4 (0.64-0.67 µm)
- Near infrared (NIR) in band 5 (0.85-0.88 µm)
- Short-wave infrared (SWIR 1) in band 6 (1.57-1.65 µm)
- Short-wave infrared (SWIR 2) in band 7 (2.11-2.29 µm)
- Panchromatic in band 8 (0.50-0.68 µm)
- Cirrus in band 9 (1.36-1.38 µm)
- Thermal infrared (TIRS 1) in band 10 (10.60-11.19 µm)
- Thermal infrared (TIRS 2) in band 11 (11.50-12.51 µm)
Each band has a spatial resolution of 30 meters, except bands 8, 10, and 11: band 8 has a 15-meter resolution, while bands 10 and 11 have a 100-meter pixel size.
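As a sketch, the band list above can be turned into a small lookup table (ranges in µm, resolutions in meters, copied from the figures in this post), which makes it easy to ask which bands see a given wavelength:

```python
# Landsat-8 bands from the list above: band number -> (name, min_um, max_um, resolution_m)
LANDSAT8_BANDS = {
    1: ("Coastal aerosol", 0.43, 0.45, 30),
    2: ("Blue", 0.45, 0.51, 30),
    3: ("Green", 0.53, 0.59, 30),
    4: ("Red", 0.64, 0.67, 30),
    5: ("NIR", 0.85, 0.88, 30),
    6: ("SWIR 1", 1.57, 1.65, 30),
    7: ("SWIR 2", 2.11, 2.29, 30),
    8: ("Panchromatic", 0.50, 0.68, 15),
    9: ("Cirrus", 1.36, 1.38, 30),
    10: ("TIRS 1", 10.60, 11.19, 100),
    11: ("TIRS 2", 11.50, 12.51, 100),
}

def bands_covering(wavelength_um: float) -> list[str]:
    """List the Landsat-8 bands whose range includes the given wavelength."""
    return [name for name, lo, hi, _res in LANDSAT8_BANDS.values()
            if lo <= wavelength_um <= hi]

print(bands_covering(0.65))  # red light is seen by the Red and Panchromatic bands
print(bands_covering(1.0))   # no band covers 1.0 um
```

Note that a query at 1.0 µm returns nothing, because no band is defined between 0.88 and 1.36 µm.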
If you’re wondering why there is no band between 0.88 and 1.36 µm, atmospheric absorption is the main reason why no sensors detect these wavelengths.
The TRW Lewis satellite, launched in 1997, was meant to be the first hyperspectral satellite system. Unfortunately, NASA lost contact with it.
NASA later had a successful mission: the Hyperion imaging spectrometer (part of the EO-1 satellite) is an example of a hyperspectral sensor. Hyperion produces 30-meter-resolution images in 220 spectral bands (0.4-2.5 µm).
NASA’s Airborne Visible / Infrared Imaging Spectrometer (AVIRIS) is an example of an airborne hyperspectral sensor. AVIRIS delivers 224 contiguous channels with wavelengths from 0.4 to 2.5 µm.
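The band counts follow directly from the bandwidths: dividing a sensor's spectral range by its band width gives an approximate number of bands. A rough back-of-envelope check against the figures above (a sketch, not the sensors' exact band layouts):

```python
def approx_band_count(range_min_um: float, range_max_um: float,
                      bandwidth_nm: float) -> int:
    """Approximate number of contiguous bands of a given width over a spectral range."""
    range_nm = (range_max_um - range_min_um) * 1000  # convert um to nm
    return round(range_nm / bandwidth_nm)

# Hyperion and AVIRIS both cover 0.4-2.5 um. At ~10 nm per band, that range
# implies roughly 210 bands, in the same ballpark as their 220/224 channels.
print(approx_band_count(0.4, 2.5, 10))  # -> 210
```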
Multispectral vs hyperspectral
- Multispectral: 3-10 wider bands.
- Hyperspectral: Hundreds of narrow bands.
The higher level of spectral detail in hyperspectral images gives a better capability to see the unseen. For example, hyperspectral remote sensing distinguished between three minerals thanks to its high spectral resolution, while the multispectral Landsat Thematic Mapper could not tell the three minerals apart. One downside, however, is that all those bands add a level of complexity.
Learn more about multispectral and hyperspectral imaging applications.