With at least one image of every location on Earth per season for 43 years, the Landsat data archive contains more than 50 trillion pixels. Land changes over this time could be tracked using this data, but geo-processing all of it would take a single computer about 15 years. Until recently, there was no practical way to compute the data; then cloud computing provided an answer.
Since the 1990s, University of Maryland geographers and remote sensing specialists Matthew Hansen and Sam Goward have been mapping changes in Earth’s land cover. “We wanted to know the impact of disturbance—harvesting, thinning, fires, storms—things that lead to changes in forests,” said Goward. “Every time you disturb a forest, it restarts the growth cycle, and when you do that, you impact the carbon cycle. Very few forests make it through a full growth cycle because of disturbances, but no one knows the patterns or how they impact the carbon cycle.”
Goward and Hansen worked with low-resolution data for several years, but disturbance happens on a small scale that demands something like the 30-meter resolution of Landsat satellites. However, researchers had to pay for every Landsat image, and it was simply too expensive to apply the research globally. “We did the science we could afford,” Goward said, “not the science we wanted to do.”
Then in 2008, Landsat data was made freely available on the Internet. “We then knew we could make a global map,” Hansen said, “but we didn’t have the computing power yet.” While attending an international meeting about deforestation and forest disturbance, he was introduced to Rebecca Moore, a computer scientist and mapping researcher at Google. Hansen saw an opportunity. “Their computing expertise fit perfectly with our geographic knowledge. So we ported our code for mapping forests to the Google system.”
In just a few days, Google applied the University of Maryland analysis code to 700,000 Landsat scenes, discarding cloudy pixels and keeping clear ones. The code reviewed the remaining sequence of observations for each pixel and assigned a flag to each—was it forested or not? The analysis noted the date when forests were cleared, or the date when they had grown back in enough to be counted as forest again. The entire process took one million hours on 10,000 central processing units. Moore noted: “The analysis would have taken 15 years on a single computer.”
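The per-pixel workflow described above can be sketched in a few lines. This is a simplified illustration, not the actual University of Maryland code: the vegetation-index threshold, field names, and forest test are all hypothetical stand-ins for the real classifier.

```python
from dataclasses import dataclass

# Hypothetical threshold: pixels with NDVI at or above this count as forested.
FOREST_NDVI_THRESHOLD = 0.6

@dataclass
class Observation:
    date: str       # acquisition date, e.g. "2002-06-10"
    cloudy: bool    # cloud-mask flag for this pixel
    ndvi: float     # vegetation index value

def analyze_pixel(series):
    """Discard cloudy observations, flag each clear one as forest or not,
    and record the first clearing date and first regrowth date."""
    clear = [o for o in series if not o.cloudy]  # keep only clear pixels
    flags = [(o.date, o.ndvi >= FOREST_NDVI_THRESHOLD) for o in clear]
    loss_date = regrowth_date = None
    for (_, was_forest), (date, is_forest) in zip(flags, flags[1:]):
        if was_forest and not is_forest and loss_date is None:
            loss_date = date        # forest -> cleared
        elif loss_date and not was_forest and is_forest and regrowth_date is None:
            regrowth_date = date    # cleared -> grown back in
    return loss_date, regrowth_date

series = [
    Observation("2000-06-01", False, 0.8),
    Observation("2001-06-05", True,  0.2),  # cloudy: discarded
    Observation("2002-06-10", False, 0.3),  # cleared
    Observation("2006-06-20", False, 0.7),  # regrown
]
print(analyze_pixel(series))  # ('2002-06-10', '2006-06-20')
```

Run over every pixel in 700,000 scenes, even this trivial per-pixel loop adds up to the million CPU-hours Moore describes, which is why the job only became feasible on a distributed cloud platform.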
The resulting maps created by Hansen and colleagues agree with other research on deforestation, which estimates that anywhere from 53 to 72 teragrams of stored carbon (mostly in trees) were removed from the Democratic Republic of the Congo between 2000 and 2010. Most of the forest losses were due to cut-and-burn agriculture, in which small plots of land were cleared for subsistence farming or for fuelwood.
“In a world of scarce resources, there are distinct tradeoffs in costs and benefits of land use, and whether to conserve or convert forest to cropland,” Bush said. “Map-based images are perhaps one of the most succinct means of helping policymakers digest complex ideas of socially and economically driven environmental change.”
Armed with their data-rich maps, Hansen and colleagues would like to someday create a global forest loss alert system. The team is also working to develop tools to distinguish the causes of forest change—such as fire, mechanical removal, disease, and storms—from afar.
“We have a globally consistent, locally relevant map product that can be used in a variety of applications: estimating emissions from deforestation, modeling biodiversity, assessing protected areas, and studying forest and human health,” Hansen said. “We plan to move our record forward and backward where Landsat has a sufficiently rich archive of data.”