At the Global Land Cover Facility at the University of Maryland, College Park, the computers are in training. For the past few months, geographer and remote-sensing scientist John Townshend and his UMD research group, along with Jeff Masek of NASA Goddard Space Flight Center and Matt Hansen of South Dakota State University, have been testing and retesting the computers’ ability to identify forested areas in Landsat images and report how much they have changed over time.
The training is a warm-up for a survey of how forests worldwide changed from 1990 to 2000 and from 2000 to 2005. All this testing is necessary, explains Townshend, because of the sheer volume of data the computers have to go through: the analysis takes an enormous amount of time. To complete just one step, say locating all tree-covered areas in North America, the computers will run continuously for weeks.
In the 1990s, Townshend was involved in efforts to make continental-scale forest assessments for the tropics using Landsat images. “Following that project,” Townshend said, “we tried to estimate what it would cost if we were to do this for the whole world, and it was always just too much money with the technologies available then.”
A big part of getting the images into a usable state is finding cloud-free views of every landscape on Earth and making sure that all the images from different eras line up seamlessly. “The GLS gets rid of a large proportion of that pre-processing, and that is just incredibly valuable,” said Townshend. With those steps out of the way, Townshend and his colleagues can tackle another preliminary step: removing the haziness that the atmosphere causes in all images of Earth from space.
Sometime this fall, Townshend’s computers will have finished their training and will be turned loose on the Global Land Survey datasets for 1990 to 2000. “I’ve been waiting twenty-five years for this,” Townshend says. “I’m not joking.”