Flames raced through Northern California last summer, destroying thousands of homes and killing four people. At Reveal, we wanted to tell the story of how three wildfires spread so quickly, so we turned to satellite data.
As different satellites scan the globe, they gather a variety of information. That data gathering, known as remote sensing, is used regularly by scientists, and the data is free and publicly available.
Satellite data is a fairly recent tool in most newsrooms.
We used it last year to show how California’s drought affected vegetation health and how that, in turn, affected the state’s wildfire season. For that project, we worked with the MODIS sensor on NASA’s Terra satellite. This time, we turned to Landsat 8, a joint satellite mission between NASA and the U.S. Geological Survey. Both satellites capture imagery across a wide spectrum of light, stretching beyond what the eye can see. But Landsat 8 has a much finer spatial resolution than MODIS, which means you can see more detail in the images.

Landsat 8 scans Earth every 16 days. We selected imagery from four days between late July and September, chosen primarily based on how much cloud cover obscured the areas where the fires were burning.
The data is available online from the U.S. Geological Survey for free. Landsat 8 captures multiple bands of light – both what the naked eye can see and what it cannot – and each band is stored in its own file, for a total of 12 images. By combining these bands in different ways, we could tell different aspects of the story.
Band | Name | Wavelength (micrometers) | Resolution (meters) | Purpose |
---|---|---|---|---|
1 | Coastal aerosol | 0.43 – 0.45 | 30 | Studies looking at coastal areas or focusing on aerosols such as dust or ash. |
2 | Blue | 0.45 – 0.51 | 30 | Bathymetric mapping. Separating some vegetation types as well as distinguishing soil from vegetation. |
3 | Green | 0.53 – 0.59 | 30 | Vegetation health. |
4 | Red | 0.64 – 0.67 | 30 | Vegetation slopes. |
5 | Near infrared | 0.85 – 0.88 | 30 | Shorelines and biomass. |
6 | Shortwave infrared 1 | 1.57 – 1.65 | 30 | Moisture content of soil and vegetation. Also penetrates thin clouds. |
7 | Shortwave infrared 2 | 2.11 – 2.29 | 30 | Better moisture content analysis and cloud penetration. |
8 | Panchromatic | 0.50 – 0.68 | 15 | Sharper image in the red, green and blue wavelengths. |
9 | Cirrus | 1.36 – 1.38 | 30 | Used to detect cirrus clouds. |
10 | Thermal infrared 1 | 10.60 – 11.19 | 100 resampled to 30 | Heat mapping and soil moisture. |
11 | Thermal infrared 2 | 11.50 – 12.51 | 100 resampled to 30 | Improved heat mapping and soil moisture. |
12 | Quality assurance | NA | 30 | Provides metadata on each pixel. |
Combining the red, green and blue bands allowed us to produce a true-color image of what the land looks like to the naked eye. Adding the panchromatic band helped us sharpen the images, because it captures the red, green and blue wavelengths at a finer resolution of 15 meters per pixel.
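For readers who want to try this, here is a minimal sketch of that band-combining step in Python with the open-source rasterio library. This is an illustration rather than our exact workflow, and the file names are placeholders for the band files in a downloaded scene:

```python
import numpy as np
import rasterio

def compose_rgb(band_paths, out_path):
    """Stack three single-band GeoTIFFs into an 8-bit RGB GeoTIFF."""
    bands = []
    profile = None
    for path in band_paths:
        with rasterio.open(path) as src:
            data = src.read(1).astype("float32")
            profile = src.profile
        # Landsat 8 pixels are 16-bit; stretch each band to 0-255
        # using a 2-98 percentile cut, a common starting point.
        lo, hi = np.percentile(data[data > 0], (2, 98))
        scaled = np.clip((data - lo) / (hi - lo), 0, 1) * 255
        bands.append(scaled.astype("uint8"))

    profile.update(count=3, dtype="uint8")
    with rasterio.open(out_path, "w", **profile) as dst:
        for i, band in enumerate(bands, start=1):
            dst.write(band, i)

# Bands 4 (red), 3 (green) and 2 (blue) -> a true-color image.
compose_rgb(["scene_B4.TIF", "scene_B3.TIF", "scene_B2.TIF"],
            "true_color.tif")
```

Full pansharpening is more involved – the 30-meter color bands have to be resampled onto band 8’s 15-meter grid before being recombined – which is one reason we eventually leaned on Landsat-util, described below.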

To examine vegetation health, we used a second technique that combines imagery captured in the near infrared, red and green bands. This method was developed during World War II to detect camouflage.

Vegetation is shown as red, with higher intensities indicating healthier plant life. Water is shown as black, and buildings or other manmade structures appear as white or tan. By combining this imagery with data showing what type of vegetation was growing where and when fire had last burned in the area, we were able to establish the condition of plant life in the areas where the fires burned.
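Using the hypothetical compose_rgb helper from the sketch above, producing this composite is just a matter of changing the band order (file names are again placeholders):

```python
# Bands 5 (near infrared), 4 (red) and 3 (green) -> a false-color
# image in which healthy vegetation reads as bright red.
compose_rgb(["scene_B5.TIF", "scene_B4.TIF", "scene_B3.TIF"],
            "color_infrared.tif")
```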
Lastly, we wanted to look at burn patterns. Wildfires do not burn uniformly, but rather leave scorched ground in a mosaic pattern. A variety of factors play a role in determining where a fire burns, including wind, terrain and vegetation health.
To do this, we combined a shortwave infrared band with the near infrared and green bands to create false-color images, which show information not visible to the naked eye. Vegetation appears a vibrant green, and burned areas a bright magenta. Older burns appear in a more muted pink. We were even able to pick up spots still burning, which showed up as red and white.
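In the same hypothetical setup, that combination maps shortwave infrared to the red channel:

```python
# Bands 7 (shortwave infrared 2), 5 (near infrared) and 3 (green) ->
# fresh burns reflect strongly in shortwave infrared and weakly in
# near infrared, so they show up magenta against green vegetation.
compose_rgb(["scene_B7.TIF", "scene_B5.TIF", "scene_B3.TIF"],
            "burn_pattern.tif")
```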

Once we created the imagery, we could see isolated islands of unburned vegetation, but we did not really understand why those areas had escaped the flames. Our reporting showed that fire typically moves uphill, but not always. We used data from the geological survey’s National Elevation Dataset to examine the satellite imagery in 3-D and get a better idea of how terrain affected the fires’ spread.
Most of the unburned areas were downhill from the fire’s path, on slopes generally facing north. North-facing slopes are more resistant to fire because they get less exposure to the sun, and fire typically does not move downhill.
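This is not how we built our 3-D views, but for a rough sense of how elevation data becomes terrain shading, here is a minimal hillshade sketch with numpy and rasterio. The input file name is a placeholder for an elevation tile, and the sun position uses common cartographic defaults:

```python
import numpy as np
import rasterio

# Read a single-band elevation tile (placeholder file name).
with rasterio.open("ned_tile.tif") as src:
    elev = src.read(1).astype("float32")
    profile = src.profile

# Common default sun position: 315 degrees azimuth, 45 degrees altitude.
azimuth, altitude = np.radians(315.0), np.radians(45.0)

# Surface gradients; for real maps, divide by the cell size so the
# slope angles come out in ground units.
dy, dx = np.gradient(elev)
slope = np.pi / 2.0 - np.arctan(np.hypot(dx, dy))
aspect = np.arctan2(-dx, dy)

# Illumination of each cell, mapped to 0-255 grayscale.
shade = (np.sin(altitude) * np.sin(slope)
         + np.cos(altitude) * np.cos(slope) * np.cos(azimuth - aspect))
shade = ((shade + 1) / 2 * 255).astype("uint8")

profile.update(dtype="uint8")
with rasterio.open("hillshade.tif", "w", **profile) as dst:
    dst.write(shade, 1)
```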
All of these layers were combined with text in an interactive called How Fire Feeds, which tells the story of the fires. As readers progress through the text, the map changes to follow the story.
Challenges with the data
We originally experimented with processing the satellite imagery in QGIS and ArcGIS, mapping software packages with support for image processing. But loading the imagery was complicated and took some work to figure out, and we were not pleased with how the satellite images looked after processing.
So we turned to Landsat-util, a command-line tool built by Development Seed. The tool can search, download and process satellite imagery quickly. It also corrects for cloud and snow cover and can handle tasks such as increasing the resolution for true-color images.
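Landsat-util is a command-line program. Here is a rough sketch of driving it from Python – the scene ID, path and flags are illustrative, so check the tool’s documentation for the options your version actually supports:

```python
import subprocess

# Download one scene's band files (the scene ID is illustrative).
subprocess.run(["landsat", "download", "LC80450332015234LGN00",
                "--bands", "432"], check=True)

# Correct, color-balance and pansharpen it into a true-color image
# (the archive path is a placeholder).
subprocess.run(["landsat", "process",
                "/path/to/LC80450332015234LGN00.tar.bz",
                "--bands", "432", "--pansharpen"], check=True)
```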
Clouds, smoke and haze can make an image appear muddy or dark. We intentionally picked imagery with little cloud cover, and snow was not an issue, but we still had some problems with smoke and clouds. Landsat-util uses information bundled with the satellite imagery to mask areas covered by clouds or smoke before color-correcting the images. The end result was a much cleaner image than we could easily produce with mapping software.
We wanted to maintain a consistent look across the images, which spanned more than two months. To do so, we had to color-correct them further after processing them with Landsat-util. Additional factors not accounted for by the software, such as the angle of the sun, can affect how a picture looks.
To solve this, we imported the images into Photoshop to correct the colors after the initial processing, following the methods outlined by Rob Simmon in this blog post. Photoshop strips out the georeferencing information needed to display an image on a map, so we had to write a script to reattach it once we were happy with how the image looked.
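As a minimal sketch of that reattachment step – not our exact script – you can copy the coordinate system and geotransform from the pre-Photoshop GeoTIFF onto the color-corrected file with rasterio (file names are placeholders):

```python
import rasterio

# Grab the spatial metadata from the GeoTIFF we had before Photoshop.
with rasterio.open("processed_scene.tif") as src:
    crs, transform = src.crs, src.transform

# Read the color-corrected image Photoshop saved, which lacks it.
with rasterio.open("photoshop_output.tif") as src:
    data = src.read()
    profile = src.profile

# Write a new GeoTIFF with the georeferencing reattached.
profile.update(crs=crs, transform=transform, driver="GTiff")
with rasterio.open("corrected_georeferenced.tif", "w", **profile) as dst:
    dst.write(data)
```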
We learned a lot along the way and hope to use satellite data to help tell other stories.
Eric Sagara can be reached at esagara@cironline.org. Follow him on Twitter: @esagara.